
Item Analysis: Coordinate Algebra Unit Three (Linear and Exponential Functions)

Valerie A. Brown

ITEM ANALYSIS

Item Analysis: Coordinate Algebra Unit Three (Linear and Exponential Functions)

Effective assessment is an important part of being a professional educator. The primary

purpose of assessment is to promote and document student learning (Stiggins, 2012). In order to

use assessment to effectively inform instruction and thus promote learning, one must confirm the

quality of the assessment relative to its ability to provide both reliable and valid information

concerning students' mastery of targeted learning objectives, as well as consider characteristics

of the learners themselves. Item analysis is one tool educators employ to accomplish this task.

From a Classical Test Theory perspective, this involves the examination of three main

parameters: item difficulty, item discrimination, and distractor function (Samuelsen, 2015). This

paper will discuss the item analysis of a county mandated Coordinate Algebra unit test on linear

and exponential functions along with its corresponding table of specifications, each considered

mindful of the class composition.

Table of Specifications

As item analysis is a process designed to evaluate the intentional worth - in this case the

inferential value relative to the degree of students' concept mastery - of individual test items, and

refine them with the purpose of elevating assessment performance, it is helpful to first consider

the scope and distribution of these individual elements collectively. A table of specifications is a

convenient format that allows educators to quickly and clearly determine the alignment of

assessment items with unit learning objectives at various taxonomy levels in order to ensure an

adequate sample of each (Stiggins, 2012). As can be seen in Table 1, this test has a relatively

even distribution of assessment items addressing all learning objectives at one or more levels of

cognitive demand. Of the seven specified standards, four are addressed by two items and three

by one item. Additionally, a total of three items address both the lowest and highest levels of


cognitive demand, while five items target the mid-range level. Specifically, items eleven,

thirteen, and seventeen address depth of knowledge (D.O.K.) level 1, items sixteen, eighteen, and

nineteen address D.O.K. level 3, and the rest of the items address D.O.K. level 2. It should be

noted that this table represents only selected response assessment items; ten other items

consisting of short answer and constructed response were included in the unit assessment that

further sample some of these areas (Hall County School District, 2015).

Since assessments provide information about specific learners, it is also appropriate to

consider some class demographics. This assessment was administered to an honors-level class consisting entirely of ninth-grade students, almost evenly distributed by gender with twelve girls and thirteen boys. By the author's observation, the students are primarily Caucasian with only two students of Hispanic descent, and there is some evidence of economic disparity (e.g., brand-name clothing and accessories); however, this too appears fairly evenly distributed across the spectrum, with one or two students representative of either extreme.

Item Analysis

Combining information gleaned from the table of specifications with the calculated item

analysis parameters in Table 2 provides deeper insight into the assessment's functioning.

Item Difficulty

Effectively gauging distinct levels of student learning necessitates the inclusion of items

that vary according to their level of difficulty, which, for the purposes of this paper, is defined as the

percentage of students answering correctly (Guidelines for Post Exam Review [Guidelines],

2007). Item difficulty (p) values for this assessment ranged from 0.52 to 1 with items sixteen,

eighteen, and nineteen returning the three lowest values. This is supported by the table of specifications, in which those three problems were designated as level three D.O.K. and so would


be expected to be the most difficult problems. Similarly, items eleven, thirteen, and seventeen

were designated as level one D.O.K., and exhibit the three highest p values with the exception of

item twelve. D.O.K. level two items were fairly evenly distributed across the connecting range

with items fourteen and twenty presenting as slightly easier than the level three items, and items

fifteen and twenty-one marginally more difficult than the level one items. According to

Guidelines (2007) and Understanding Item Analysis Reports (2005), optimum levels of

difficulty in order to maximize discrimination range from about 0.63 to about 0.74 for a four-response selected-response assessment. By this criterion, only one of the eleven items clearly qualifies; however, if one considers acceptable p values as ranging from 0.30 to 0.90 for a

criterion-referenced assessment in accordance with DCOM guidelines, all but two items qualify

(Samuelsen, 2015).
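As defined above, item difficulty p is simply the proportion of students answering an item correctly. A minimal sketch of the calculation, using hypothetical response data rather than the actual class results:

```python
def item_difficulty(responses):
    """Return p, the fraction of correct responses (coded 1) for one item."""
    return sum(responses) / len(responses)

# 25 hypothetical responses to one item: 13 correct, 12 incorrect
q = [1] * 13 + [0] * 12
print(item_difficulty(q))  # 0.52, the lowest p observed on this test
```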

Item Discrimination

Difficulty level also pertains to the discriminatory ability of an item. Discriminatory

ability refers to the ability to differentiate between those students who exhibit greater

understanding of the content, as evidenced by higher assessment scores, and those with less

complete understanding evidenced by lower scores (Understanding Item Analysis Reports

[Understanding], 2005). Discrimination values (d) are calculated as the difference between the

probability of high-scoring examinees answering an item correctly and the probability of low-scoring examinees answering it correctly (Samuelsen, 2015). An item that is either too hard, such that nearly all students either guess or miss it, or too easy, such that nearly all

students get it right, will have a low discrimination value (d) (Understanding, 2005). Ideal d

values should be approximately 0.25 to 0.5 (Samuelsen, 2015). From Table 2 approximately

half of the items fall within this range, and most of the other half fall between 0 and 0.25. This is


significant as one expects generally lower d values for assessments measuring a wide range of

content; this assessment in its entirety addressed ten different standards (Understanding, 2005).

Alternatively, these low d values could suggest vague phrasing of the item and therefore warrant

further investigation (Understanding, 2005). Of particular concern is item twenty: an item with a negative d value should, in most cases, be eliminated, as the negative value suggests either a mis-keyed answer or, at minimum, questionable inferential validity concerning student learning (Guidelines, 2007;

Understanding, 2005).
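The upper-lower discrimination index described above can be sketched as follows: d is the proportion of the high-scoring group answering the item correctly minus the proportion of the low-scoring group answering it correctly. The 27% grouping fraction below is a common convention and an assumption here; the paper does not state which fraction was used in its calculations.

```python
def discrimination(item_scores, total_scores, frac=0.27):
    """Upper-lower discrimination index d = p_upper - p_lower.

    item_scores: 0/1 results on one item; total_scores: each student's
    overall test score, used to form the high- and low-scoring groups."""
    n = max(1, round(frac * len(item_scores)))
    order = sorted(range(len(item_scores)), key=lambda i: total_scores[i])
    low = [item_scores[i] for i in order[:n]]    # lowest-scoring group
    high = [item_scores[i] for i in order[-n:]]  # highest-scoring group
    return sum(high) / n - sum(low) / n
```

With this definition, an item missed mostly by low scorers yields a positive d, while an item missed disproportionately by high scorers (as with item twenty here) yields a negative d.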

Considered together, item difficulty and item discrimination can be compared with the

DCOM Suggested Guidelines for Reviewing and Eliminating Question Items to assess item

suitability (Samuelsen, 2015). Item eleven had a difficulty of 1.00, and therefore, a

discrimination value of 0.00. According to the DCOM guidelines, this item, as well as item twenty-one (difficulty 0.84, discrimination 0.00), borders on problematic; both should be reviewed and refined before future inclusion on assessments (Samuelsen, 2015).

Additionally, item eighteen warrants review as it had very low discriminatory power although

56% of students answered correctly, and item twenty should be eliminated in its present form

because of suspect accuracy of learning measurement, despite nearly 70% of students answering

correctly (Guidelines, 2007). The remaining items all fall within acceptable limits for both

discrimination and difficulty although item twelve had low difficulty and item nineteen had

higher than ideal discrimination.
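The screening logic applied in the two paragraphs above can be sketched as a coarse pass over the p and d values from Table 2. The thresholds below are the ones cited in this paper (acceptable p from 0.30 to 0.90, ideal d from 0.25 to 0.5, negative d grounds for elimination); the actual DCOM guidelines may include further rules.

```python
ITEMS = {  # (p, d) values transcribed from Table 2
    "Q11": (1.00, 0.00), "Q12": (0.92, 0.17), "Q13": (0.88, 0.25),
    "Q14": (0.60, 0.25), "Q15": (0.80, 0.25), "Q16": (0.52, 0.42),
    "Q17": (0.88, 0.25), "Q18": (0.56, 0.08), "Q19": (0.56, 0.67),
    "Q20": (0.68, -0.08), "Q21": (0.84, 0.00),
}

def screen(p, d):
    """Return a coarse disposition for one item."""
    if d < 0:
        return "eliminate (possible mis-key)"
    if p < 0.30 or p > 0.90 or d < 0.25:
        return "review"
    return "acceptable"

for item, (p, d) in ITEMS.items():
    print(item, screen(p, d))
```

This pass flags items eleven, twelve, eighteen, and twenty-one for review and item twenty for elimination, matching the paper's conclusions; item nineteen's high d of 0.67 passes the coarse filter even though the paper notes it as above the ideal range.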

Distractor Frequency

Another important component of item analysis is distractor frequency. Distractor

frequency refers to the number of students choosing particular incorrect answers. In Table 2

those distractor responses identified as tapping into specific misconceptions are indicated by


asterisks (*) superscripted to the right of the number of responses. Misconceptions listed by

item number are as follows:

Item eleven - Answers A, B, and C all tap into the misconception that x- or y-intercepts equate to a solution of both equations. As no students answered incorrectly, no misconceptions were indicated by this item.

Item twelve - Answer C taps the misconception that curved lines are not functions. Only one student answered with this incorrect response.

Item thirteen - Answer D taps into a basic misunderstanding about function notation. Two students answered with this incorrect response, confusing which value substitutes in for a given variable.

Item fourteen - Answers C and D tap into misconceptions about inputs (domains) and outputs (ranges). These answers were each chosen by five students who flipped inputs and outputs, misinterpreting what quantity each referred to in the word problem. This is a

common problem for students.

Item fifteen - All students who missed this problem incorrectly answered C, indicating A and D were ineffective distractors. As this item was a continuation of the same equation as item fourteen, it is possible that most students guessed but were aided by context clues in the item that allowed them to narrow the choices to only the correct response or choice C.

Item sixteen - Nearly half the class missed this item, with answers A and D the most common incorrect responses, each drawing five students, and two students choosing answer B. These choices indicate the same misunderstanding as item fourteen about inputs and outputs of functions, and along with the high number of incorrect responses to item fourteen suggest the need to revisit this topic in future instruction.


Item seventeen - All three students who answered incorrectly chose answer choice A, which indicated that they misunderstood constant versus variable rates of change. As with item fifteen, the other two distractors were ineffective in diagnosing other misconceptions.

Item eighteen - Answers B and C were chosen by five and six students respectively, and tap into essentially the same misconception as item sixteen: accurate function evaluation for given inputs to match a specified sequence of outputs.

Items nineteen and twenty - Distractor choices for each of these items functioned by drawing nearly the same number of incorrect responses, although they didn't target specific misconceptions, suggesting that students generally did not know how to solve the problems.

Item twenty-one - Four students missed this item, with one each choosing B and D, and two choosing C. Of concern, though, was the greater propensity of the high-scoring students to incorrectly answer this item relative to the low-scoring students. Consequently, although 84% of

students answered correctly, this item may need to be revised.
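Tallying distractor frequencies like those reported in Table 2 amounts to counting how often each response option was chosen. The sketch below uses a hypothetical response pattern, not the actual class data.

```python
from collections import Counter

def option_frequencies(answers, options="ABCD"):
    """Percentage of students choosing each response option, rounded to
    whole percents; options other than the keyed answer are the
    distractors whose frequencies the analysis examines."""
    counts = Counter(answers)
    n = len(answers)
    return {opt: round(100 * counts[opt] / n) for opt in options}

# Hypothetical choices from 25 students on one item (key = 'C')
choices = list("CCCCCCCCCCCCCCCAAAAADDDDB")
print(option_frequencies(choices))  # {'A': 20, 'B': 4, 'C': 60, 'D': 16}
```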

Conclusions

In the course of conducting this item analysis much more specific information was

discerned than is evident from standard calculations such as measures of the central tendency of

grades. For instance, items that are too easy may bolster student confidence but offer little in terms of describing disparate levels of learning; items answered correctly by more low-scoring students than high-scoring students signal significant issues and should be revised before inclusion in future assessments; and misconceptions are sometimes illuminated by particular distractor functioning, which can be especially helpful for informing subsequent teaching.


While item analysis doesn't lend itself equally well to all assessment types, and one must remain mindful of the error inherent in any assessment, its utility for future selected-response assessments has been confirmed in the opinion of this author, who, in consideration of time constraints, will make it a periodic if not routine practice in order to craft quality assessments and inform teaching.


References

Guidelines for post exam review. (2007, November 7). [PDF]. Retrieved from

http://www.lmunet.edu/public/uploads/dcom/pdfs/interpreting_item_analysis.pdf

Hall County School District. (2015). Coordinate algebra unit 3A: Linear and exponential

functions. Gainesville, GA: Hall County Schools.

Kentucky Department of Education. (2007). Support materials for core content for assessment.

Retrieved from http://education.ky.gov/curriculum/docs/documents

/cca_dok_support_808_mathematics.pdf

Samuelsen, K. (2015). Item analysis and differential item function [PowerPoint slides].

Stiggins, R. J. (2012). An introduction to student-involved assessment for learning (6th ed.).

Upper Saddle River, N.J.: Pearson/Merrill Prentice Hall.

Understanding item analysis reports. (2005). [PDF]. Retrieved from

https://www.washington.edu/oea/services/scanning_scoring/scoring/item_analysis.html


Tables

Table 1

Test Specifications for Coordinate Algebra Unit Three: Linear and Exponential Functions

| Standard | Description (abridged) | D.O.K. 1 | D.O.K. 2 | D.O.K. 3 | Total |
|---|---|---|---|---|---|
| MGSE9-12.A.REI.11 | ... approximations, recognize the solution to f(x) = g(x) | 1 (11) | | | 1 |
| MGSE9-12.F.IF.1 | ... element of the domain exactly one element of the range. If f is a function, x is the input (domain) and f(x) is the output (range). | | 2 (12, 15) | | 2 |
| MGSE9-12.F.IF.2 | ... domain inputs, and interpret statements that use function notation in terms of a context. | 1 (13) | | | 1 |
| MGSE9-12.F.IF.5 | ... and the quantitative relationship it describes. | | 1 (14) | 1 (16) | 2 |
| MGSE9-12.F.IF.6 | Calculate and interpret the average rate of ... Estimate the rate of change from a graph. | 1 (17) | | | 1 |
| MGSE9-12.F.BF.1 | ... between two quantities. | | 1 (21) | 1 (18) | 2 |
| MGSE9-12.F.BF.2 | Write arithmetic and geometric sequences both recursively and explicitly, and use them to model situations. Connect arithmetic sequences to linear functions and geometric sequences to exponential functions. | | 1 (20) | 1 (19) | 2 |
| Total | | 3 | 5 | 3 | 11 |

Note: D.O.K. refers to Webb's Depth of Knowledge scale. Levels increase in cognitive demand from 1 to 3. Cell entries show the number of items, with item numbers in parentheses.


Table 2

Item Difficulty, Discrimination and Distractor Frequency

| Item | p | d | A | B | C | D | Key |
|---|---|---|---|---|---|---|---|
| Q11 | 1.00 | 0.00 | 0* | 0* | 0* | 100 | D |
| Q12 | 0.92 | 0.17 | 92 | 4 | 4* | 0 | A |
| Q13 | 0.88 | 0.25 | 4 | 0 | 88 | 8* | C |
| Q14 | 0.60 | 0.25 | 60 | 0 | 20* | 20* | A |
| Q15 | 0.80 | 0.25 | 0 | 80 | 20* | 0 | B |
| Q16 | 0.52 | 0.42 | 20* | 8 | 52 | 20* | C |
| Q17 | 0.88 | 0.25 | 12* | 0 | 88 | 0 | C |
| Q18 | 0.56 | 0.08 | 0 | 20* | 24* | 56 | D |
| Q19 | 0.56 | 0.67 | 12 | 16 | 16 | 56 | D |
| Q20 | 0.68 | -0.08 | 12 | 68 | 8 | 12 | B |
| Q21 | 0.84 | 0.00 | 84 | 4 | 8 | 4 | A |

Note: p = difficulty, ranging from 0 (hard) to 1 (easy). d = discrimination, ranging from -1 (more low performers correct than high performers) to 1 (more high performers correct than low performers). Response-option frequencies are expressed as percentages of the 25 responders (students a through y); asterisks (*) mark distractors targeting specific misconceptions. (Individual 0/1 responses for each student are omitted.)
