
ADMINISTERING,

ANALYZING, & IMPROVING


TESTS

Prepared by:
FERNANDO R. SEQUETE JR., LPT
LEARNING OUTCOMES
At the end of the chapter, you should be able to:
1. Define the basic concepts regarding item analysis;
2. Identify the steps in improving test items;
3. Solve difficulty index and discrimination index;
4. Identify the level of difficulty of an item;
5. Perform item analysis properly and correctly;
6. Identify the item to be rejected, revised, or retained; and
7. Interpret the results of item analysis.
INTRODUCTION
•The teacher normally prepares a draft of the test.
•The draft is subjected to item analysis and validation to be useful
and functional.
•Try-out phase – the draft test will be tried out on a group of
students with characteristics similar to those of the intended test takers.
•Item analysis phase – each item will be analyzed in terms of its
ability to discriminate, and also its level of difficulty.
•Item revision phase – allow the teacher to decide whether to
revise or replace an item.
INTRODUCTION
•The final draft of the test is subjected to validation if the intent is
to make use of the test as a standard test for the particular unit
or grading period.
PACKAGING AND REPRODUCING TEST ITEMS
✓Put the items with the same format together.
✓Arrange the test items from easy to difficult.
✓Give proper spacing for each item for easy reading.
✓Keep questions and options on the same page.
✓Place the illustrations near the options.
✓Check the key answer.
✓Check the directions of the test.
✓Provide space for name, date, and score.
✓Proofread the test.
✓Reproduce the test.
ADMINISTERING THE EXAMINATION
✓The administration procedures greatly affect the
performance of the students in the test.
✓The test administration does not simply mean giving the
test questions to the students and collecting the test
papers after the given time.
ADMINISTERING THE EXAMINATION
Guidelines BEFORE Administering Examinations
1. Try to induce a positive test-taking attitude.
2. Inform the students about the purpose of the test.
3. Give oral directions as early as possible before distributing the tests.
4. Do not give test-taking hints about guessing, skipping, and the like; these
are strictly prohibited.
5. Inform the students about the length of time allowed for the test. If
possible, write on the board the time in which they must be finished with
answering the test. Give the students a warning before the end of the
time limit.
ADMINISTERING THE EXAMINATION
Guidelines BEFORE Administering Examinations
6. Tell the students how to signal or call your attention if they have a
question.
7. Tell the students what to do with their papers when they are done
answering the test (how papers are to be collected).
8. Tell the students what to do when they are done with the test,
particularly if they are to go on to another activity (also write these
directions on the chalkboard so they can refer to them).
9. Rotate the method of distributing papers so you don’t always start from
the left or the front row.
ADMINISTERING THE EXAMINATION
Guidelines BEFORE Administering Examinations
10. Make sure the room is well lighted and has a comfortable temperature.
11. Remind students to put their names on their papers (and where to do so).
12. If the test has more than one page, have each student check to see
that all pages are there.
ADMINISTERING THE EXAMINATION
Guidelines DURING the Examination
1. Avoid giving instructions or talking while the examination is going
on to minimize interruptions and distractions.
2. Avoid giving hints.
3. Monitor to check student progress and discourage cheating.
4. Give time warnings if students are pacing their work appropriately.
5. Make a note of any questions students ask during the test so that items
can be revised for future use.
6. Test papers must be collected uniformly to save time and to keep test
papers from being misplaced.
ADMINISTERING THE EXAMINATION
Guidelines AFTER the Examination
1. Grade the papers (and add comments if you can); do test
analysis (see the module on test analysis) after scoring
and before returning papers to students if at all possible.
If it is impossible to do your test analysis before returning
the papers, be sure to do it at another time. It is important
to do both the evaluation of your students and the
improvement of your tests.
ADMINISTERING THE EXAMINATION
Guidelines AFTER the Examination
2. If you are recording grades or scores, record them
in pencil in your class record before returning the
papers. If there are errors or adjustments in
grading, the grades are easier to change when
recorded in pencil.
3. Return papers in a timely manner.
ADMINISTERING THE EXAMINATION
Guidelines AFTER the Examination
4. Discuss test items with the students. If students have questions, agree to look
over their papers again, as well as the papers of others who have the same
question. It is usually better not to agree to make changes in grades on the
spur of the moment while discussing the tests with the students but to give
yourself time to consider what action you want to take. The test analysis may
have already alerted you to a problem with a particular question that is
common to several students, and you may already have made a decision
regarding that question (to disregard the question and reduce the highest
possible score accordingly, to give all students credit for that question,
among others).
ANALYZING THE TEST
❑ After administering and scoring the test, the teacher should also analyze
the quality of each item in the test. Through this, we can identify the items
that are good, the items that need improvement, and the items to be removed
from the test.
❑ But when do we consider that the test is good?
❑ How do we evaluate the quality of each item in the test?
❑ Why is it necessary to evaluate each item in the test?
❑ Lewis Aiken (1997), an author of psychological and educational
measurement, pointed out that a “postmortem” is just as necessary in
classroom assessment as it is in medicine.
ANALYZING THE TEST
❑We shall introduce the technique to help teachers determine the
quality of a test item known as item analysis.
❑One of the purposes of item analysis is to improve the quality of
the assessment tools. Through this process, we can identify the
item that is to be retained, revised, or rejected, and the content
of the lesson that is mastered or not.
❑There are two kinds of item analysis: quantitative item analysis
and qualitative item analysis (Kubiszyn and Borich, 2007).
ITEM ANALYSIS
❑Item analysis is a process of examining the students’
responses to individual items in the test.
❑Through the use of item analysis, we can identify
which of the given test items are good and which are defective.
❑Good items are to be retained and defective items
are to be improved, to be revised or to be rejected.
ITEM ANALYSIS
Use of Item Analysis:
▪ Item analysis data provide a basis for efficient class discussion of the
test results.
▪ Item analysis data provide a basis for remedial work.
▪ Item analysis data provide a basis for general improvement of
classroom instruction.
▪ Item analysis data provide a basis for increased skills in test
construction.
▪ Item analysis procedures provide a basis for constructing a test bank.
TYPES OF QUANTITATIVE ITEM ANALYSIS
1. Difficulty Index
▪ It refers to the proportion of students in the upper and lower groups who
answered an item correctly.
▪ The larger the proportion, the more students have learned the subject matter
measured by the item.
▪ To compute the difficulty index of an item, use the formula:

DF = n / N, where

DF = difficulty index
n = number of students selecting the correct answer in the upper group and in the lower group
N = total number of students who answered the test
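The formula above can be sketched in code. A minimal illustration (function and variable names are my own, not from the text):

```python
def difficulty_index(correct_upper: int, correct_lower: int, total: int) -> float:
    """DF = n / N: n = correct answers in the upper and lower groups, N = examinees."""
    return (correct_upper + correct_lower) / total

# 10 upper-group and 4 lower-group students answered correctly out of 40
print(difficulty_index(10, 4, 40))  # 0.35
```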
TYPES OF QUANTITATIVE ITEM ANALYSIS
Level of Difficulty
▪ To determine the level of difficulty of an item, find first the difficulty index using the
formula and identify the level of difficulty using the range given below.
▪ The higher the value of the index of difficulty, the easier the item is. Hence, more
students got the correct answer, and more students mastered the content measured by
that item.
Index Range Difficulty Level
0.00-0.20 Very difficult
0.21-0.40 Difficult
0.41-0.60 Average/Moderately difficult
0.61-0.80 Easy
0.81-1.00 Very easy
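The range table can be read as a simple lookup. A hedged sketch, with the boundaries taken from the table above:

```python
def difficulty_level(df: float) -> str:
    """Map a difficulty index (0.0 to 1.0) to the levels in the index-range table."""
    if df <= 0.20:
        return "Very difficult"
    if df <= 0.40:
        return "Difficult"
    if df <= 0.60:
        return "Average/Moderately difficult"
    if df <= 0.80:
        return "Easy"
    return "Very easy"

print(difficulty_level(0.35))  # Difficult
```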
TYPES OF QUANTITATIVE ITEM ANALYSIS
2. Discrimination Index
▪ The power of the item to discriminate between the students who scored high and
those who scored low in the overall test.
▪ It is the power of the item to discriminate the students who know the lesson and those
who do not know the lesson.
▪ It also refers to the number of students in the upper group who got an item
correct minus the number of students in the lower group who got the item correct,
divided by the number of students in either group (use the larger group size if
they are not equal).
▪ It is the basis of measuring the validity of an item.
▪ It can be interpreted as an indication of the extent to which overall knowledge of the
content area or mastery of the skills is related to the response on an item.
TYPES OF QUANTITATIVE ITEM ANALYSIS
Types of Discrimination Index
1. Positive discrimination happens when more students in the upper
group got the item correctly than those students in the lower group.
2. Negative discrimination occurs when more students in the lower
group got the item correctly than the students in the upper group.
3. Zero discrimination happens when the numbers of students in the upper
group and the lower group who answer the item correctly are equal;
hence, the test item cannot distinguish between the students who performed
well in the overall test and the students whose performance was very poor.
TYPES OF QUANTITATIVE ITEM ANALYSIS
Level of Discrimination – Ebel and Frisbie (1986) as cited by Hetzel
(1997) recommended the use of Level of Discrimination of an Item for
easier interpretation.
Index Range       Discrimination Level
0.19 and below    Poor item, should be eliminated or needs to be revised
0.20-0.29         Marginal item, needs some revision
0.30-0.39         Reasonably good item, but possibly subject to improvement
0.40 and above    Very good item
TYPES OF QUANTITATIVE ITEM ANALYSIS
Discrimination Index Formula

DI = (C_UG - C_LG) / D, where

DI = discrimination index value
C_UG = number of students selecting the correct answer in the upper group
C_LG = number of students selecting the correct answer in the lower group
D = number of students in either the lower group or the upper group

Note: Use the larger number in case the sizes of the upper and lower groups
are not equal.
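The discrimination index formula translates directly to code. A minimal sketch (names are illustrative, not from the text):

```python
def discrimination_index(correct_upper: int, correct_lower: int,
                         n_upper: int, n_lower: int) -> float:
    """DI = (C_UG - C_LG) / D, with D the larger group size if they differ."""
    return (correct_upper - correct_lower) / max(n_upper, n_lower)

# 10 of 20 upper-group and 4 of 20 lower-group students answered correctly
print(discrimination_index(10, 4, 20, 20))  # 0.3
```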
TYPES OF QUANTITATIVE ITEM ANALYSIS
Steps in Solving Difficulty Index and Discrimination Index
1. Arrange the scores from highest to lowest.
2. Separate the scores into the upper group and the lower group. There are
different methods to do this: (a) if a class consists of 30 students who take an
exam, arrange their scores from highest to lowest, then divide them into two
groups: the higher scores belong to the upper group and the lower scores
belong to the lower group; and (b) other literature suggests using 27%, 30%,
or 33% of the students for the upper group and the lower group. In the
Licensure Examination for Teachers (LET), the test developers always use 27%
of the examinees for the upper and lower groups.
TYPES OF QUANTITATIVE ITEM ANALYSIS
Steps in Solving Difficulty Index and Discrimination Index
3. Count the number of those who chose the alternatives in the upper and
lower group for each item and record the information using the template:
Options A B C D E
Upper Group
Lower Group
Note: Put an asterisk on the correct answer.
4. Compute the value of the difficulty index and the discrimination index, and
analyze the responses to each distracter.
5. Make an analysis for each item.
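The steps above can be sketched end to end. The data below are hypothetical, and the 27% split follows step 2(b):

```python
def split_groups(records, fraction=0.27):
    """Sort by score (highest first) and return the top and bottom fractions."""
    ranked = sorted(records, key=lambda r: r["score"], reverse=True)
    k = max(1, round(len(ranked) * fraction))
    return ranked[:k], ranked[-k:]

# Hypothetical class of 30: higher scorers chose the key "B", lower scorers "C"
students = [{"score": i, "answer": "B" if i > 10 else "C"} for i in range(1, 31)]
upper, lower = split_groups(students)
key = "B"
c_ug = sum(1 for s in upper if s["answer"] == key)   # correct in upper group
c_lg = sum(1 for s in lower if s["answer"] == key)   # correct in lower group
print(len(upper), c_ug, c_lg)  # 8 8 0
```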
TYPES OF QUANTITATIVE ITEM ANALYSIS
Checklist for Discrimination Index
It is very important to determine whether a test item will be retained, revised, or
rejected. Using the discrimination index, we can identify the non-performing question
items; just remember that it seldom indicates what the problem is. Use the given
checklist below:
Yes No
1. Does the key discriminate positively?
2. Do the incorrect options discriminate negatively?

• If the answers to questions 1 and 2 are both YES, retain the item.
• If the answer to one of questions 1 and 2 is YES and the other is NO, revise the item.
• If the answers to questions 1 and 2 are both NO, eliminate or reject the item.
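The three checklist rules map to a small decision function. A sketch (function and argument names are my own):

```python
def checklist_decision(key_discriminates_positively: bool,
                       distracters_discriminate_negatively: bool) -> str:
    """Retain if both checks pass, reject if both fail, otherwise revise."""
    if key_discriminates_positively and distracters_discriminate_negatively:
        return "retain"
    if key_discriminates_positively or distracters_discriminate_negatively:
        return "revise"
    return "reject"

print(checklist_decision(True, False))  # revise
```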
TYPES OF QUANTITATIVE ITEM ANALYSIS
3. Analysis of Response Options
▪ Another way to evaluate the performance of the entire test item is through
the analysis of the response options.
▪ It is very important to examine the performance of each option in multiple-
choice item.
▪ You can determine whether the distracters or incorrect options are effective
or attractive to those who do not know the correct answer.
▪ The attractiveness of the incorrect options is determined when more students
in the lower group than in the upper group choose it.
▪ Analyzing the incorrect options allows teachers to improve the test items
so that they can be used again in the future.
TYPES OF QUANTITATIVE ITEM ANALYSIS
Distracter Analysis
1. Distracter
▪ It is the term used for the incorrect options in the multiple-choice type of test
while the correct answer represents the key.
▪ Using quantitative item analysis, we can determine if the options are good
or if the distracters are effective.
▪ Item analysis can identify non-performing test items, but it seldom
indicates the error or the problem in the given item. There are several
factors to consider as to why students fail to get the correct answer to a
given question.
TYPES OF QUANTITATIVE ITEM ANALYSIS
Distracter Analysis
1. Distracter
a) It is not taught in the class properly.
b) It is ambiguous.
c) The correct answer is not in the given options.
d) It has more than one correct answer.
e) It contains grammatical clues to mislead the students.
f) The student is not aware of the content.
g) The students were confused by the logic of the question because it has
double negatives.
h) The student failed to study the lesson.
TYPES OF QUANTITATIVE ITEM ANALYSIS
Distracter Analysis
2. Miskeyed item – the test item is a potential miskey if more students
from the upper group choose an incorrect option than the key.
3. Guessing item – students from the upper group have equal spread of choices
among the given alternatives. Students from the upper group guess their
answers because of the following reasons:
a. the content of the test is not discussed in the class or in the text
b. the test item is very difficult
c. the question is trivial
4. Ambiguous item – this happens when more students from the upper group
choose equally an incorrect option and the keyed answer.
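The miskey and ambiguity patterns above can be checked from the option counts. A hedged sketch using the Options-table layout (a dict of upper-group counts; the strict-tie test for ambiguity is my own simplification of "about equal"):

```python
def potential_miskey(upper_counts: dict, key: str) -> bool:
    """True if some incorrect option attracts more upper-group students than the key."""
    return any(n > upper_counts[key]
               for opt, n in upper_counts.items() if opt != key)

def ambiguous(upper_counts: dict, key: str) -> bool:
    """True if an incorrect option ties the key among upper-group students."""
    return any(n == upper_counts[key]
               for opt, n in upper_counts.items() if opt != key)

# Example #4-style counts: key is A, but the upper group prefers D
print(potential_miskey({"A": 1, "B": 2, "C": 3, "D": 10, "E": 4}, "A"))  # True
```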
QUALITATIVE ITEM ANALYSIS
❑ Qualitative item analysis (Zurawski, R. M.) is a process in which the teacher or
an expert carefully proofreads the test before it is administered to check for
typographical errors, to avoid grammatical clues that may give away the
correct answer, and to ensure that the reading level of the materials is
appropriate.
❑ These procedures can also include small group discussions on the quality of
the examination and its items with examinees who have already taken the
test.
❑ According to Cohen, Swerdlik, and Smith (1992), as cited by Zurawski,
students who took the examination are asked to express verbally their
experience in answering each item in the examination.
IMPROVING TEST ITEMS
❑Item analysis enables the teachers to improve
and enhance their skills in writing test items.
❑To improve multiple-choice test items, we shall
consider the stem of the item, the distracters
and the key answer.
EXAMPLE #1
A class is composed of 40 students. Divide the group into
two. Option B is the correct answer. Based on the given
data on the table, as a teacher, what would you do with
the test item?
Options A B* C D E
Upper Group 3 10 4 0 3
Lower Group 4 4 8 0 4
EXAMPLE #1
Compute the difficulty index:
n = 10 + 4 = 14
N = 40
DF = n/N = 14/40 = 0.35 or 35%

Compute the discrimination index:
C_UG = 10, C_LG = 4, D = 20
DI = (C_UG - C_LG)/D = (10 - 4)/20 = 6/20 = 0.30 or 30%
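As a cross-check, Example #1's arithmetic can be reproduced in code (counts copied from the Example #1 table; B is the key):

```python
upper = {"A": 3, "B": 10, "C": 4, "D": 0, "E": 3}
lower = {"A": 4, "B": 4, "C": 8, "D": 0, "E": 4}
key, N, D = "B", 40, 20

df = (upper[key] + lower[key]) / N        # difficulty index
di = (upper[key] - lower[key]) / D        # discrimination index
print(df, di)  # 0.35 0.3
```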
EXAMPLE #1
Make an analysis about the level of difficulty, discrimination and
distracters.
a) Only 35% of the examinees got the answer correctly, hence, the
item is difficult.
b) More students from the upper group got the answer correctly,
hence, it has a positive discrimination.
c) Retain options A, C, and E because most of the students who did
not perform well in the overall examination selected them. Those
options attract most students from the lower group.
EXAMPLE #1

Conclusion: Retain the test item but change option D;
make it more realistic so that it becomes effective for
the upper and lower groups. An incorrect option is
considered effective when at least 5% of the examinees
choose it.
EXAMPLE #2
A class is composed of 50 students. Use 27% to get the
upper and the lower groups. Analyze the item given the
following results. Option D is the correct answer. What will
you do with the test item?
Options A B C D* E
Upper Group (27%) 3 1 2 6 2
Lower Group (27%) 5 0 4 4 1
EXAMPLE #2
Compute the difficulty index:
n = 6 + 4 = 10
N = 50
DF = n/N = 10/50 = 0.20 or 20%

Compute the discrimination index:
C_UG = 6, C_LG = 4, D = 14
DI = (C_UG - C_LG)/D = (6 - 4)/14 = 2/14 = 0.14 or 14%
EXAMPLE #2
Make an analysis.
a) Only 20% of the examinees got the answer correctly; hence, the item is very
difficult.
b) More students from the upper group got the answer correctly, hence, it has a positive
discrimination.
c) Modify options B and E because more students from the upper group chose them
compared with the lower group; hence, they are not effective distracters because
most of the students who performed well in the overall examination selected them
as their answers.
d) Retain options A and C because most of the students who did not perform well in the
overall examination selected them as the correct answers. Hence, options A and C
are effective distracters.
EXAMPLE #2

Conclusion: Revise the item by
modifying options B and E.
EXAMPLE #3
A class is composed of 50 students. Use 27% to get the
upper and the lower groups. Analyze the item given the
following results. Option E is the correct answer. What will
you do with the test item?
Options A B C D E*
Upper Group (27%) 2 3 2 2 5
Lower Group (27%) 2 2 1 1 8
EXAMPLE #3
Compute the difficulty index:
n = 5 + 8 = 13
N = 50
DF = n/N = 13/50 = 0.26 or 26%

Compute the discrimination index:
C_UG = 5, C_LG = 8, D = 14
DI = (C_UG - C_LG)/D = (5 - 8)/14 = -3/14 = -0.21 or -21%
EXAMPLE #3
Make an analysis.
a) 26% of the students got the answer to test item correctly, hence, the test
item is difficult.
b) More students from the lower group got the item correctly, therefore, it is a
negative discrimination. The discrimination index is -21%.
c) No need to analyze the distracters because the item discriminates
negatively.
d) Modify all the distracters because they are not effective. Most of the
students in the upper group chose the incorrect options. The options are
effective if most of the students in the lower group chose the incorrect
options.
EXAMPLE #3

Conclusion: Reject the item because
it has a negative discrimination
index.
EXAMPLE #4
Potential Miskeyed Item. Make an item analysis of the
table below. What will you do with a test item that is a
potential miskey?

Options A* B C D E
Upper Group 1 2 3 10 4
Lower Group 3 4 4 4 5
EXAMPLE #4
Compute the difficulty index:
n = 1 + 3 = 4
N = 40
DF = n/N = 4/40 = 0.10 or 10%

Compute the discrimination index:
C_UG = 1, C_LG = 3, D = 20
DI = (C_UG - C_LG)/D = (1 - 3)/20 = -2/20 = -0.10 or -10%
EXAMPLE #4
Make an analysis.
a) More students from the upper group chose option D than
option A, even though option A is supposedly the correct
answer.
b) Most likely, the teacher has written a wrong answer key.
c) The teacher should check and find out whether he/she miskeyed the
answer that he/she thought was the correct answer.
d) If the teacher miskeyed it, he/she must check and retally the
scores of the students’ test papers before giving them back.
EXAMPLE #4
Make an analysis.
e) If option A is really the correct answer, revise the item to weaken option D;
distracters are not supposed to draw more attention than the keyed
answer.
f) Only 10% of the students got the answer to the test item correctly,
hence, the test item is very difficult.
g) More students from the lower group got the item correctly, therefore a
negative discrimination resulted. The discrimination index is -10%.
h) No need to analyze the distracters because the test item is very
difficult and discriminates negatively.
EXAMPLE #4

Conclusion: Reject the item
because the test item is very
difficult and has a negative
discrimination index.
EXAMPLE #5
Ambiguous Item. Below is the result of item analysis of a
test with an ambiguous test item. What can you say about
the item? Are you going to retain, revise or reject it?

Options A B C D E*
Upper Group 7 1 1 2 8
Lower Group 6 2 3 3 6
EXAMPLE #5
Compute the difficulty index:
n = 8 + 6 = 14
N = 39
DF = n/N = 14/39 = 0.36 or 36%

Compute the discrimination index:
C_UG = 8, C_LG = 6, D = 20
DI = (C_UG - C_LG)/D = (8 - 6)/20 = 2/20 = 0.10 or 10%
EXAMPLE #5
Make an analysis.
a) Only 36% of the students got the answer to the test item correctly,
hence, the test item is difficult.
b) More students from the upper group got the item correctly, hence,
it discriminates positively. The discrimination index is 10%.
c) About equal numbers of top students went for option A and option
E, which implies that they could not tell which is the correct answer.
The students do not know the content of the test well; thus, reteaching
is needed.
EXAMPLE #5

Conclusion: Review the test item
because it is ambiguous.
EXAMPLE #6
Guessing Item. Below is the result of an item analysis for a
test item with students’ answers mostly based on a guess.
Are you going to reject, revise or retain the test item?

Options A B C* D E
Upper Group 4 3 4 3 6
Lower Group 3 4 3 4 5
EXAMPLE #6
Compute the difficulty index:
n = 4 + 3 = 7
N = 39
DF = n/N = 7/39 = 0.18 or 18%

Compute the discrimination index:
C_UG = 4, C_LG = 3, D = 20
DI = (C_UG - C_LG)/D = (4 - 3)/20 = 1/20 = 0.05 or 5%
EXAMPLE #6
Make an analysis.
a) Only 18% of the students got the answer to the test item correctly,
hence, the test item is very difficult.
b) More students from the upper group got the correct answer to the
test item; therefore, the item has a positive discrimination. The
discrimination index is 5%.
c) Students respond about equally to all alternatives, an indication that
they are guessing.
d) If the test item is well-written but too difficult, reteach the material to
the class.
EXAMPLE #6
Make an analysis.
Three possibilities why students guess the answer on a test item:
▪ The content of the test item has not yet been discussed in the
class because the test is designed in advance;
▪ Test items were so badly written that students have no idea what
the question is really about; and
▪ Test items were very difficult as shown from the difficulty index
and low discrimination index.
EXAMPLE #6

Conclusion: Reject the item because it is
very difficult, the discrimination
index is very poor, and options A and
E are not effective distracters.
EXAMPLE #7
The table below shows an item analysis of a test item with
ineffective distracters. What can you conclude about the
test item?

Options A B C* D E
Upper Group 5 3 9 0 3
Lower Group 6 4 6 0 4
EXAMPLE #7
Compute the difficulty index:
n = 9 + 6 = 15
N = 40
DF = n/N = 15/40 = 0.38 or 38%

Compute the discrimination index:
C_UG = 9, C_LG = 6, D = 20
DI = (C_UG - C_LG)/D = (9 - 6)/20 = 3/20 = 0.15 or 15%
EXAMPLE #7
Make an analysis.
a) Only 38% of the students got the answer to the test item
correctly, hence, the test item is difficult.
b) More students from the upper group answered the test item
correctly; as a result, the item has a positive discrimination. The
discrimination index is 15%.
c) Options A, B, and E are attractive and effective distracters.
d) Option D is ineffective; therefore, replace it with a more realistic
one.
EXAMPLE #7

Conclusion: Revise the item
by changing option D.
