
Question 1: Analysis of Objective and Subjective items

PERFORMANCE ANALYSIS OF 5 BISTARI CLASS FOR MATHEMATICS MONTHLY TEST, YEAR 2021

PLACEMENT: OBJECTIVE ITEMS (10 MARKS) AND SUBJECTIVE ITEMS (30 MARKS)

No. | NAME | POSITION IN CLASS | OBJECTIVE ITEMS, Item No. 1–10 (1 mark each) | SUBJECTIVE ITEMS: 1(a) (10 m), 1(b) (5 m), 2(a) (10 m), 2(b) (5 m) | TOTAL
1 ADILAH FARAH 1 / / / / / / X / / / 10 5 10 5 39
2 AISYAH FARHANA 2 / / / / / / X / / / 9 5 10 5 38
3 AMNI BATRISYIA 3 / / X / / / X / / X 8 5 10 4 34
4 AUNI AFIFAH 4 / / / / X / X / / X 8 5 4 2 26
5 BENN COLLIN LINUS 5 / X / / X / / / / X 7 5 7 4 30
6 CARMEN EDWARD 6 / / X / X x X / / X 9 5 3 3 25
7 DAISY JUNE 7 / X / / X / X / / X 9 5 4 4 28
8 DAYANGKU MIRSHA 8 X X / / X x / / / X 9 4 3 3 24
9 FARAH SYAZRINA 9 X X / / X x X / / X 9 4 2 3 22
10 FATIN NUR 10 / X / / X x / / / X 9 5 2 3 25
11 FRESCILLA ANDREA 11 X X / / X x X / / X 9 4 6 3 27
12 HANI NASYUHA 12 / X X / X x X / / X 7 5 6 3 25
13 JACQUETTA DABRIA 13 X X / / X / X / / X 7 3 3 3 21
14 JACYNTA MARK 14 / / / / X x X / / X 9 3 3 5 26
15 JESSIE JANIUS 15 / X / / X x / / / X 9 2 3 4 24
16 JOE EZRA DUANIK 16 / X / / X x / / / X 9 5 5 1 26
17 KIMBERLY MIKE 17 / X / / X x / / / X 9 5 5 3 28
18 KRISJAIR JOSEPH 18 / X X / X / / / / X 5 2 3 3 19
19 KRISTA WONG 19 / X X / X x / X / X 5 2 0 4 15
20 LARA ALEX BRUNEI 20 / X X / X x / X / X 5 2 0 3 14
21 WELLA WONG 21 X / X X X x / X / X 5 1 0 3 12
22 AIN AZ ZAHRA 22 X X X X X x / X / X 5 1 0 2 10
23 ELLRIENA HILSA 23 X / X X / x / / x X 3 0 0 5 12
24 MUHAMMAD FAIQ 24 X X X X X x X X x X 3 2 0 2 7
25 UMI SUMAYYAH 25 / X X X X x / X x X 0 0 0 1 3
26 IVY YONG 26 X / X / / X X / / X 0 2 0 1 8
27 WONG CHIN HI 27 X / X / X X X / / X 1 1 0 1 7
28 ZAINAL AHMAD 28 X X X / X X X / / / 1 2 0 1 8
29 MOHD ZAINAL 29 X X X X X X X / X / 1 0 0 1 4
30 MELVIN YONG 30 X X X X X X X / / / 0 0 0 0 3
Total Correct Answers (selected items): Item 1 = 17, Item 5 = 5, Item 9 = 26, Item 10 = 5; subjective item totals: 1(a) = 180, 1(b) = 90, 2(a) = 89, 2(b) = 85
NT (correct answers in the top group): Item 5 = 3, Item 9 = 5, Item 10 = 2
NR (correct answers in the bottom group): Item 5 = 0, Item 9 = 1, Item 10 = 2
p (Difficulty Index): Item 5 = 0.17, Item 9 = 0.87, Item 10 = 0.17; 1(a) = 0.6, 1(b) = 0.6, 2(a) = 0.3, 2(b) = 0.6
DI (Discrimination Index): calculated in the Discrimination Index section below

/ = CORRECT ANSWER
X = INCORRECT ANSWER

TABLE 1. Analysis of the Monthly Mathematics Test of 5 Bistari Class

Difficulty Index Level
Objective Item
The difficulty index for Item 9 shows that it was answered correctly by 26 out of 30 students. The difficulty index indicates an easy item, as calculated below:
p = 26/30
  = 0.87
The higher the value, the easier the item. If almost all of the students answer an item correctly, the concept is probably not worth testing.
However, Items 5 and 10 were each answered correctly by only 5 out of 30 students. The difficulty index shows that these items are hard, as calculated below:
p = 5/30
  = 0.17
Items with a p value below 0.20 are very difficult and should be reviewed for possibly confusing language, or removed from subsequent tests. If almost all of the students get an item wrong, there is either a problem with the item or the students did not grasp the concept.
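The p calculations above can be sketched in a few lines of code (Python is used here purely for illustration; the counts come from Table 1 and the function name is our own):

```python
def difficulty_index(num_correct, num_students):
    """Difficulty index p: the proportion of students who answered the item correctly."""
    return num_correct / num_students

# Counts from Table 1
print(round(difficulty_index(26, 30), 2))  # Item 9 → 0.87 (easy)
print(round(difficulty_index(5, 30), 2))   # Items 5 and 10 → 0.17 (very difficult, p < 0.20)
```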

Subjective Item
Items 1(a) and 1(b) both have a Difficulty Index of 0.6. For a subjective item,
p = mean score / full marks

Item 1(a)
p = mean score (180 / 30 students = 6) / full marks (10)
  = 0.6 = 60%

However, for Item 2(a):
p = mean score (89 / 30 students ≈ 3) / full marks (10)
  ≈ 0.3 = 30%

The difficulty indices of the subjective items fall within the range of 30–70%, i.e. from moderately difficult to moderately easy, which is the range highly recommended for item construction.
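The same idea can be expressed in code for constructed-response items; note that the mean is the class total divided by the 30 students (a minimal Python sketch with an illustrative function name, using the totals from Table 1):

```python
def subjective_difficulty(total_score, num_students, full_marks):
    """p for a subjective item: mean score divided by the full marks of the item."""
    mean_score = total_score / num_students
    return mean_score / full_marks

# Class totals from Table 1
print(round(subjective_difficulty(180, 30, 10), 2))  # Item 1(a) → 0.6
print(round(subjective_difficulty(89, 30, 10), 2))   # Item 2(a) → 0.3
```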

Discrimination Index Level


Discrimination Index (DI) = Point-Biserial correlation (PBS)

Discrimination Index (D) Scale

Value          | Classification                                            | Remarks
0.40 and above | Very good item                                            | Accepted
0.30 – 0.39    | Reasonably good item, but possibly subject to improvement | Accepted
0.20 – 0.29    | Marginal item, subject to improvement                     | Accepted
Below 0.20     | Poor item                                                 | To be revised

Table 1.1X: Discrimination Index (D) Scale

The higher the value, the more discriminating the item.


Item 5
D = (3 − 0) / 15
  = 0.20
As stated in Table 1.1X, Item 5 is a marginal item: it is accepted, but subject to improvement.

Item 9
D = (5 − 1) / 15
  ≈ 0.27
As stated in Table 1.1X, Item 9 is also a marginal item with room for improvement.
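These discrimination calculations can be sketched as follows (Python for illustration; NT and NR are the numbers of correct answers in the top and bottom groups of 15 students, and the classification thresholds mirror Table 1.1X):

```python
def discrimination_index(nt, nr, group_size=15):
    """D = (NT - NR) / n for upper and lower groups of equal size n."""
    return (nt - nr) / group_size

def classify(d):
    """Classification per the Discrimination Index (D) scale in Table 1.1X."""
    if d >= 0.40:
        return "Very good item"
    if d >= 0.30:
        return "Reasonably good item"
    if d >= 0.20:
        return "Marginal item"
    return "Poor item"

print(classify(discrimination_index(3, 0)))  # Item 5: D = 0.20 → Marginal item
print(round(discrimination_index(5, 1), 2))  # Item 9: D ≈ 0.27
```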
Question 2: The effectiveness of the distractor found in the Objective Item based on the
response of students who have answered the objective question.

Item No. 6               A     B*    C     D
High Performing Pupils   0     6     1     1
Low Performing Pupils    0     0     7     1
TOTAL                    0     6     8     2

*Correct Answer (Key Answer)

(TABLE 2: Pupils' Responses to Item No. 6)

Table 2 illustrates how item analysis reveals the difficulty of an item and how well it discriminates between stronger and weaker learners in the class. Objectively scored items are more reliable than subjectively scored items: a broad sample of achievement can be measured, and performance can be compared from class to class. Distractors are sometimes written so close to the correct answer that they may confuse learners who really know the answer to the question. "Distractors should differ from the key in a substantial way, not just in some minor nuances of phrasing or emphasis" (Isaacs, 1994). A well-constructed item is largely independent of the learners' reading ability and of the constructor's writing ability, so learning
outcomes from simple to complex can be measured.
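A common rule of thumb (an assumption on our part, not stated in the source) is that a distractor is functioning when it attracts at least some examinees, mostly from the low-performing group. A minimal Python sketch applying that rule to the counts in Table 2 (the function name and verdict labels are illustrative):

```python
def evaluate_distractors(responses, key):
    """Judge each distractor by who it attracts; counts are (high group, low group)."""
    verdicts = {}
    for option, (high, low) in responses.items():
        if option == key:
            continue  # skip the keyed response
        if high + low == 0:
            verdicts[option] = "non-functioning"  # attracts nobody; consider replacing
        elif low >= high:
            verdicts[option] = "effective"        # pulls mainly weaker students
        else:
            verdicts[option] = "suspect"          # misleads stronger students
    return verdicts

# Counts for Item No. 6 from Table 2 (key = B)
item6 = {"A": (0, 0), "B": (6, 0), "C": (1, 7), "D": (1, 1)}
print(evaluate_distractors(item6, "B"))
# → {'A': 'non-functioning', 'C': 'effective', 'D': 'effective'}
```

On this reading of Table 2, option A attracts no one at all, while C and D are doing their job as distractors.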
Question 3: Strengths and Weaknesses found in the example of objective questions given in
terms of aspects of Item Writing. (Objective Items - Multiple Choice).

Item    Quantity   Price
Cap     1          RM12.50
Shirt   2

6. The table above, which is incomplete, shows the items purchased by Raju. Raju had RM345.00, and after buying the above items, the balance he has is RM180.00.

What is the price of the shirt he bought?

A. RM55.00
B. RM76.25
C. RM82.50
D. RM152.50

Test writing is a profession; by that we mean that good test writers are professionally trained in designing test items. The correct answer is called the keyed response and the incorrect options are called distractors.
To begin with, Item 6 shows both strengths and weaknesses, as the students' answers can be analysed from the high-performing down to the under-performing students. The distractors have played their role in shaping the students' results.
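For reference, the arithmetic behind Item 6 can be checked in a few lines (values taken from the question stem). It also shows why options B and D both look defensible: D (RM152.50) is the total paid for the two shirts, while B (RM76.25) is the price per shirt, which is what the stem actually asks for.

```python
# Values from the question stem of Item 6
money = 345.00      # Raju's money (RM)
balance = 180.00    # balance after the purchase (RM)
cap_price = 12.50   # price of one cap (RM)
num_shirts = 2

spent = money - balance                   # RM165.00 spent in total
shirts_total = spent - cap_price          # RM152.50 for the two shirts (option D)
shirt_price = shirts_total / num_shirts   # RM76.25 per shirt (option B, the key)
print(shirt_price)  # → 76.25
```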

i. Strengths
The stem is in the form of a complete question. It is expressed clearly and concisely, without ambiguity or complex syntax, and it generally asks for one answer only. Only plausible and attractive alternatives are used as distractors; note that C and D are serious distractors. Moreover, the question:
 presents a practical, real-world situation to the learner; in Bloom's Taxonomy, this relates to Evaluation: the students' judgment;
 presents a single, definite statement to be completed or answered by one of the several given choices;
 uses a clear table that requires interpretation; in Bloom's Taxonomy, this is problem solving, i.e. Analysis: breaking things down (critical thinking);
 uses clear, straightforward language in the stem of the item; in Bloom's Taxonomy, this relates to Synthesis: putting things together (creative thinking).

ii. Weaknesses
 In a poor item, both options B and D could be considered correct.
 Responses to distractors can be influenced by reading ability. There is a lack of feedback on individual thought processes, and it is difficult to determine why individual learners select incorrect responses.
 Since there is more than one defensible "correct" answer, options B and D may encourage guessing, even though the item requires higher-order thinking skills.

As a result, difficult items tend to discriminate between those who know and those who do not know the answer. A student who does not know the subject matter will be unable to answer correctly, as solving the problem requires higher-order thinking skills even if the question looks easy. Hence, it is important to carry out item analysis in order to evaluate the quality of each item; it also suggests ways of improving the measurement of a test.
