pubs.acs.org/jchemeduc
■ INTRODUCTION
One important goal in science education is fostering students' understanding of core scientific concepts.1,2 However, science education research has shown that students struggle to develop a deeper understanding of these concepts.3 Recently, learning progressions have been suggested as a means to foster students' progression in understanding core concepts.4 Learning progressions build on the idea that instruction and assessment need to be aligned. At the core of a learning progression is a sequence of levels reflecting an increasingly sophisticated understanding of the concept. As students' pathways to scientific understanding are oftentimes neither linear nor the same for every student, this sequence is thought to describe an idealized pathway.5 And while students are not all expected to have the same level of understanding or to progress in the same way, each level is considered to mark a major step toward a deeper scientific understanding. Instructional components specifically designed for each level of understanding are expected to help students progress to the next level. However, in order to choose the proper instructional component for their students (or individual groups of students), teachers need to know about students' current level of understanding.6 Hence, assessment instruments are needed that can elicit individual students' level of understanding.

While interviews or open-ended items are considered to provide a rich view of one student's understanding, such methods do not efficiently provide an overview of students' understanding in class. Although multiple-choice items are much more efficient for larger groups, they do not elicit students' understanding as well as interviews can. Recently, Briggs, Alonzo, Schwab, and Wilson7 suggested that ordered multiple-choice (OMC) items might offer deeper insights into students' thinking while still being efficient to analyze. In an OMC item, each response option reflects a particular level of understanding. In contrast to concept inventories, OMC items incorporate a developmental perspective. An instrument based on OMC items will not only provide information about which alternative conceptions of a scientific concept students hold but also allow for determining where students are on their way toward a deeper understanding of the respective concept. Students who consistently choose response options related to one level across a set of OMC items can be expected to have obtained that level of understanding.

Published: November 5, 2013
© 2013 American Chemical Society and Division of Chemical Education, Inc. dx.doi.org/10.1021/ed3006192 | J. Chem. Educ. 2013, 90, 1602−1608
Thus, OMC items provide a more detailed picture of students' understanding compared to multiple-choice items. Previous research suggests that OMC items have the potential to assess students' level of understanding almost as well as interviews.8

The purpose of this paper is to detail our efforts in developing an instrument for assessing students' understanding of matter based on OMC items. The specific questions to be answered by this paper are as follows:
1. How can a successful process of designing an instrument using OMC items to assess students' understanding of matter be characterized?
2. To what extent does an instrument based on OMC items provide reliable and valid information about the level of students' understanding of matter?

■ METHOD
One approach to designing assessment instruments has been suggested by Wilson.9 This approach describes four building blocks in the process of designing an assessment instrument: the construct map, the items' design, the outcome space, and the measurement model (see Figure 1). In the past, Wilson's approach has been successfully used to develop assessment instruments for a variety of different constructs.7,8,10 We will describe each of the building blocks in greater detail and discuss how to use them to design an instrument assessing students' understanding of matter based on OMC items.

Figure 1. The four building blocks.9

Construct Map
The first building block entails the construction of a construct map. A construct map provides a substantive definition of the construct and an ordering of qualitatively different levels of increasing sophistication of the construct.10 That is, a construct map defines what students should understand and how students progress in developing this understanding.11 In the case of our instrument, what students should understand is the concept of matter.

Students' understanding of matter is well researched.12 Although different conceptions of what students should understand about matter can be found in the literature,13−15 researchers agree that an understanding of the structure and composition of matter is central to understanding the concept of matter as a whole.16 And while there are only a few longitudinal studies on students' progression,17,18 the extensive research based on students' conceptions of matter at different age levels provides a good picture of how students progress in their understanding of the structure and composition of matter.17−21 We synthesized five hierarchically ordered levels of students' understanding of the structure and composition of matter from a review of this research. These levels describe a progression from an understanding of the macroscopic properties of matter toward an understanding of the particulate structure of matter and its relation to macroscopic properties. The lowest level, level 1, relates to an everyday conception of matter. At this level students do not use the idea of a particulate nature of matter when trying to explain phenomena relating to science. At the next level, level 2, students have learned about the particulate nature of matter. However, instead of using a particle model, students will explain phenomena using a hybrid model, in which particles are thought to be embedded in a substance.17,22 At level 3, students are able to make use of a simple particle model to explain phenomena. Not having learned about the substructure of these particles, they believe that matter is built of particles as the smallest unit.18,23 At level 4, students can make use of a differentiated particle model, in which each particle (i.e., atom) is made up of smaller particles (i.e., protons, electrons, and neutrons).14,24 And at level 5, students can finally explain macroscopic properties of matter as a result of the properties of the particles the matter is made from and the complex interactions between these particles.25,26 For a more detailed description of the individual levels, see the Supporting Information. It is important to note that while individual students may have a different level of understanding and may progress through this sequence of levels differently, the sequence represents a reasonable basis for assessing students' level of understanding of the structure and composition of matter.

Items' Design
The second building block in the development process is the items' design. This building block relates to specifying the type of item used to elicit evidence about students' level of understanding of the construct with respect to the construct map.9 Recently, Briggs et al.7 suggested a novel type of item, the so-called ordered multiple-choice item, which aims to facilitate diagnostic assessment by linking the response options of a multiple-choice item to different levels of understanding of a construct. In an OMC item, each response option is assigned one level of understanding (Figure 2).

Figure 2. Sample ordered multiple-choice item. In order to recognize option A as the correct answer, an understanding at level 4 is required.
Similar to regular multiple-choice items, one response option is correct. This option reflects the highest level of understanding that can be tested with the item. All other response options are incorrect; these options reflect (scientifically) incorrect ideas of students at lower levels of understanding.

To develop OMC items assessing students' understanding of the structure and composition of matter, in our study we used information from both the curriculum and research on students' alternative conceptions. Based on the curriculum and respective textbooks, the item context and question were formulated.1,27 Once the question and correct response option were formulated, the response option was assigned a level of understanding. Subsequently, the response options for lower levels of understanding were formulated based on alternative conceptions that students at the respective levels of understanding typically hold.28−30 On the one hand, the aim was to develop response options meaningful to students. On the other hand, it should not be obvious to students which answer is correct. A sample item is shown in Figure 3.

Figure 3. An OMC item for the category of structure and composition.31 Additional OMC items, as well as a more detailed description of the levels of understanding, can be found in the Supporting Information.

As can be seen from this item, not all levels of understanding have to be covered by response options.7 As a matter of fact, the item shown in Figure 3 can only be used to differentiate between the first three levels of understanding. The OMC items were authored and compiled into an instrument by a team of science educators and science teachers.

Outcome Space
The third building block, the outcome space, concerns the relationship between the construct map and the items.9 In the case of our OMC items, this means how the response options relate to the levels of understanding of the structure and composition of matter. In order to investigate the validity of the assignment of levels to response options, the assignment was carried out independently by three researchers. A Fleiss κ measure was used to obtain information about the agreement of the three researchers. A value of κ = 0.88 suggests the researchers agreed quite well in their assignment of levels of understanding to the response options.

Measurement Model
In the fourth and last building block, the measurement model (i.e., the relation between the scored outcomes and students' level of understanding as described in the construct map) is defined.9 For the process of developing assessment instruments, item response theory (IRT) has proven to be fruitful. The IRT approach we used to analyze our data set is commonly known as Rasch analysis.32 For a detailed description of how Rasch analysis can guide the process of developing measurement instruments, see for example Liu and Boone.33 The first step in Rasch analysis is to define a scoring scheme. As all response options of an OMC item are linked to a specific level of understanding, each response option was assigned the value corresponding to the respective level of understanding, based on the framework described (from 1 to 5). Finally, a polytomous Rasch model, more specifically the partial credit Rasch model (PCM), was applied to the data. Analyzing the data set using the PCM can provide valuable information about item functioning with respect to the construct map. To detail this point, the design and findings of a study in which an instrument based on OMC items has been used will be reported in the next two sections.

■ DESIGN OF THE STUDY
In order to investigate the functioning of the instrument assessing students' understanding of the structure and composition of matter based on OMC items, we administered it to a sample of 294 students in grades 6−12 in a German grammar school. The instrument included a set of 10 OMC items. Because the school's science curriculum would not yet allow for students to develop an understanding at level 5, only items assessing an understanding up to level 4 were included in the instrument. In order to investigate to what extent the OMC items can be used to assess students' understanding, open-ended items corresponding to the OMC items were included in the instrument. These items were basically open-ended versions of the OMC items (same question, but no multiple-choice response format).

The testing was done on a single day. It was up to the teachers to decide whether a given class would participate in the study. For the classes that did participate, the instrument was administered to all students in those classes. (See Table 1 for information about the distribution of participating students.)

Table 1. Distribution of Students across Grades (N = 294)

Grade Level          6    7    8    9   10   11   12
Number of Students  41   73   40   47   34   29   30

Due to limited testing time, each student was presented with a different version of the instrument. Each version included the 10 OMC items and three open-ended items. The different versions were designed so that each of the 10 open-ended items was answered by the same number of students. To avoid an influence of the OMC items on the open-ended items, the open-ended items were given first. Students were allowed to proceed to the OMC items only after completing the open-ended items and were not allowed to go back to the open-ended items afterward. To obtain additional information for item refinement, each student received a questionnaire designed specifically to collect such information for a subset of three OMC items. This questionnaire included questions regarding, for example, words students would not understand, or their reasoning for choosing a particular option.34 Again, we made sure we would receive feedback for each item from about the same number of students.
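The Fleiss κ statistic used to check the three researchers' agreement on level assignments can be computed directly from the rating data. Below is a minimal sketch with invented ratings; the actual assignment data are not reported in this form, so the example numbers are illustrative only.

```python
def fleiss_kappa(ratings: list[list[int]]) -> float:
    """Fleiss' kappa for ratings[i] = the category (here: the level of
    understanding, 1-5) that each rater assigned to subject i (here:
    one response option). Every subject needs the same number of raters."""
    n_raters = len(ratings[0])
    categories = sorted({c for row in ratings for c in row})
    # counts[i][j]: how many raters put subject i into category j
    counts = [[row.count(c) for c in categories] for row in ratings]
    n_subjects = len(ratings)
    # observed agreement: per-subject pairwise agreement, averaged
    p_bar = sum(
        (sum(n * n for n in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_subjects
    # expected agreement from the marginal category proportions
    totals = [sum(row[j] for row in counts) for j in range(len(categories))]
    p_e = sum((t / (n_subjects * n_raters)) ** 2 for t in totals)
    return (p_bar - p_e) / (1 - p_e)

# Three hypothetical raters assigning levels to six response options:
ratings = [[2, 2, 2], [3, 3, 3], [1, 1, 2], [4, 4, 4], [2, 2, 2], [3, 3, 1]]
print(round(fleiss_kappa(ratings), 2))  # → 0.69
```

Unlike a simple percent agreement, κ corrects for the agreement expected by chance from the marginal level frequencies, which is why it is the appropriate check for independent level assignments.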
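The partial credit Rasch model applied in the Measurement Model step expresses the probability that a student reaches each score category of an item as a function of the student's ability and the item's step thresholds (both in logits). A minimal sketch of these category probabilities, with invented threshold values:

```python
import math

def pcm_probabilities(theta: float, thresholds: list[float]) -> list[float]:
    """Partial credit model: probability of each score category for a
    person with ability theta (logits) on an item with the given step
    thresholds delta_1..delta_m (logits). Category 0 is the baseline."""
    # cumulative sums of (theta - delta_k); category 0 contributes 0
    cums = [0.0]
    for delta in thresholds:
        cums.append(cums[-1] + (theta - delta))
    numerators = [math.exp(c) for c in cums]
    total = sum(numerators)
    return [n / total for n in numerators]

# Hypothetical item with three steps (threshold values are invented):
probs = pcm_probabilities(theta=0.5, thresholds=[-1.0, 0.2, 1.4])
print([round(p, 3) for p in probs])  # probabilities over categories 0..3
```

In practice the thresholds and abilities would be estimated jointly with specialized Rasch software rather than set by hand; the sketch only shows how the estimated parameters map onto the probability of choosing a response option at a given level.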
Third, although the levels linked to the response options of the OMC items have a significant influence on item difficulty (see Figure 5), item difficulty still overlaps for some items from different levels. That is, levels of understanding cannot necessarily be distinguished from each other. Therefore, response options should be refined in order to better represent the different levels of understanding. For example, in item 8 (Figure 4), the difference in difficulty between level 2 and level 3 needs to be adjusted. Response options that were not attractive to students could be revised. Conversely, response options that reflected quite common misconceptions could be revised in order to make the right answer more attractive.

Although the items cover the latent trait well, more items need to be developed. Particularly, more items are needed with thresholds between 0.7 and 1.7 logits. As it is well known that students do not respond consistently to similar problems set in different contexts,6 more items need to be developed to get more information about the role of context with respect to the structure and composition of matter. The clear progression (see Figure 5) might be due to the fact that most of the OMC items focused on structure−property ideas. It would be of interest to determine to what extent the progression in students' understanding was due to items that focused on single ideas (i.e., dynamics or structure itself). As the only misfitting item was the one that addressed the dynamics of particles, it might be assumed that students progress at different rates along each idea. This would be another issue to be tackled in further studies.

The possibility of establishing a link between a proposed framework (i.e., a construct map) and students' performance on the instrument (i.e., students' scores) is one of the main advantages of using Rasch analysis. Rasch analysis allowed us to link students' performance to the difficulty of (different levels of understanding in) OMC items. In the case of our instrument, it seems that students performing well on the instrument have indeed reached a high level of understanding with respect to the underlying framework. Students with lower achievement were found to prefer response options assigned to lower levels of understanding. Thus, the instrument enables differentiation between high- and low-performing students, as well as allowing interpretation of this performance in terms of qualitatively different levels of understanding of the composition and structure of matter. The possibility of this kind of interpretation makes the instrument interesting for further research and for classroom practice. Teachers can use the instrument in two important ways: (i) as a formative test to see which level of understanding students enter class with and to monitor student learning throughout instruction; and (ii) as a tool for evaluation to assess student learning and give feedback to students about their achievement. Analysis can focus on the distribution of students' responses to single items as well as students' performance on the entire test. While Rasch analysis needs specialized software (and an analysis with data from a single classroom is not necessarily meaningful), once instrument functioning has been fully established, teachers can simply evaluate their students' thinking by assigning them the level of understanding reflected by the chosen response option. Based on the analysis of both students' performance on single items and the whole instrument, teachers may then design instructional components to specifically address their students' needs. In summary, instruments based on OMC items seem to offer teachers a simple and efficient way to obtain a detailed picture of students' understanding of core concepts.

■ ASSOCIATED CONTENT
Supporting Information
Detailed description of the individual levels and of the OMC items. This material is available via the Internet at http://pubs.acs.org.

■ AUTHOR INFORMATION
Corresponding Author
*E-mail: hadenfeldt@ipn.uni-kiel.de.
Notes
The authors declare no competing financial interest.

■ ACKNOWLEDGMENTS
We would like to thank the journal reviewers and editors for their detailed and helpful recommendations. The research reported here was supported by the German Federal Ministry of Education and Research.

■ REFERENCES
(1) National Research Council. National Science Education Standards; National Academy Press: Washington, DC, 1996.
(2) Waddington, D.; Nentwig, P.; Schanze, S. Making It Comparable: Standards in Science Education; Waxmann: Münster, Germany, 2007.
(3) Vosniadou, S. International Handbook of Research on Conceptual Change; Lawrence Erlbaum: Mahwah, NJ, 2008.
(4) Duschl, R. A.; Schweingruber, H. A.; Shouse, A. W. Taking Science to School: Learning and Teaching Science in Grades K−8; National Academy Press: Washington, DC, 2007.
(5) Neumann, K.; Viering, T.; Boone, W. J.; Fischer, H. E. J. Res. Sci. Teach. 2013, 50, 162−188.
(6) Pellegrino, J. W.; Chudowsky, N.; Glaser, R., Eds. Knowing What Students Know: The Science and Design of Educational Assessment; National Academies Press: Washington, DC, 2001.
(7) Briggs, D. C.; Alonzo, A. C.; Schwab, C.; Wilson, M. Educ. Assessment 2006, 11 (1), 33−63.
(8) Alonzo, A. C.; Steedle, J. T. Sci. Educ. 2009, 93 (3), 389−421.
(9) Wilson, M. Constructing Measures: An Item Response Modeling Approach; Lawrence Erlbaum: Mahwah, NJ, 2005.
(10) Wilson, M. J. Res. Sci. Teach. 2009, 46 (6), 716−730.
(11) Wilson, M. Responding to a Challenge That Learning Progressions Pose to Measurement Practice. In Learning Progressions in Science; Alonzo, A., Gotwals, A., Eds.; Sense Publishers: Rotterdam, 2012; pp 317−344.
(12) Duit, R. Bibliography for Students' and Teachers' Conceptions and Science Education. http://www.ipn.uni-kiel.de/aktuell/stcse/stcse.html (accessed Oct 2013).
(13) Smith, C.; Wiser, M.; Anderson, C. W.; Krajcik, J.; Coppola, B. Measurement 2006, 14 (1, 2), 1−98.
(14) Stevens, S.; Delgado, C.; Krajcik, J. S. J. Res. Sci. Teach. 2010, 47 (6), 687−715.
(15) Liu, X.; Lesniak, K. J. Res. Sci. Teach. 2005, 43 (3), 320−347.
(16) Othman, J.; Treagust, D. F.; Chandrasegaran, A. L. Int. J. Sci. Educ. 2008, 30 (11), 1531−1550.
(17) Johnson, P. Int. J. Sci. Educ. 2002, 24 (10), 1037−1054.
(18) Löfgren, L.; Helldén, G. Int. J. Sci. Educ. 2009, 31 (12), 1631−1655.
(19) Liu, X. Int. J. Sci. Educ. 2001, 23 (1), 55−81.
(20) Krnel, D.; Watson, R.; Glazar, S. A. Int. J. Sci. Educ. 1998, 20 (3), 257−289.
(21) Anderson, B. Stud. Sci. Educ. 1990, 18, 53−85.
(22) Renström, L.; Andersson, B.; Marton, F. J. Educ. Psych. 1990, 82 (3), 555−569.
(23) Pfundt, H. Chim. Didakt. 1981, 7, 75−94.
(24) Harrison, A. G.; Treagust, D. F. Sci. Educ. 1996, 80 (5), 509−534.