
Article

pubs.acs.org/jchemeduc

Development of the Flame Test Concept Inventory: Measuring Student Thinking about Atomic Emission
Stacey Lowery Bretz* and Ana Vasquez Murata Mayo
Department of Chemistry and Biochemistry, Miami University, Oxford, Ohio 45056, United States

ABSTRACT: This study reports the development of a 19-item Flame Test Concept Inventory, an assessment tool to measure students’ understanding of
atomic emission. Fifty-two students enrolled in secondary and postsecondary
chemistry courses were interviewed about atomic emission and explicitly asked to
explain flame test demonstrations and energy level diagrams. Analysis of students’
explanations offered insight into students’ alternative conceptions and was used to
design items and distractors for a 19-item Flame Test Concept Inventory about
atomic emission. Results from a pilot study with first-year university chemistry
and with upper-division chemistry students were analyzed to create a final version
of the inventory that was administered to both secondary students (N = 459) and
first-year university students (N = 100) who had completed formal instruction
and course assessment about atomic emission. Analysis of student responses indi-
cated the inventory generated valid and reliable data. Common alternative
conceptions about atomic emission that remain postinstruction and their prevalence
are discussed.
KEYWORDS: High School/Introductory Chemistry, First-Year Undergraduate/General, Upper-Division Undergraduate,
Analytical Chemistry, Chemical Education Research, Demonstrations, Misconceptions/Discrepant Events, Testing/Assessment,
Atomic Spectroscopy, Atomic Properties/Structure
FEATURE: Chemical Education Research

Received: August 25, 2017
Revised: October 27, 2017
Published: November 22, 2017

■ INTRODUCTION

Students can construct coherent understandings of phenomena that do not match accepted scientific views. These alternative conceptions, if not challenged through instruction and assessment, can become integrated into students’ cognitive structures and may interfere with subsequent learning of a new concept.1 One way that instructors can elicit their students’ alternative conceptions is by administering a concept inventory to characterize students’ understandings of a particular concept compared to expert thinking.2,3

When developing a concept inventory, one goal is to design an instrument that is easy to use, can be administered in a short period of time, and can accurately identify students’ alternative conceptions by generating reliable and valid data.4 A variety of methods for developing concept inventories exist, ranging from questions, responses, and distractors drafted through expert mapping of the content domain5 to the generation of items through clinical interviews with students.6 The National Research Council7 recommends the use of student interviews in the development of assessment instruments because interviews generate qualitative data that can be used to formulate attractive distractors written in the natural language of students, thus ensuring that the test elicits common student ideas and reasoning difficulties.8 Such distractor-driven multiple-choice tests combine the richness of qualitative research with the power of quantitative assessment, thereby creating a tool to measure the prevalence of alternative conceptions with larger numbers of students.

Concept inventories have been reported in the literature for a variety of chemistry concepts including particulate nature of matter,9−11 bonding,12,13 light and heat,14−17 equilibrium,18,19 inorganic qualitative analysis,20 ionization energy,21 organic acids,22 phase changes and solutions,23,24 quantum chemistry,25 redox reactions,26 and biochemistry concepts.27,28 A majority of these concept inventories consist of questions that ask students to interact with either symbolic or particulate representations of matter. None of these concept inventories, however, asks students to interact with macroscopic representations of chemistry such as visual observations in a laboratory experiment or chemical demonstration. This paper reports the development of a concept inventory that requires students to interact with macroscopic observations, namely, an inventory to measure students’ understandings about atomic emission in the context of flame test observations.

The flame test is a longstanding demonstration in chemistry classrooms. From 1928 to 2015, the Journal has published 32 different procedures for conducting flame test demonstrations.29−60 Flame tests are a colorful, visually interesting demonstration of the anchoring concept61−63 that matter
© 2017 American Chemical Society and Division of Chemical Education, Inc.
DOI: 10.1021/acs.jchemed.7b00594
J. Chem. Educ. 2018, 95, 17−27

consists of atoms that have an internal structure that dictates their properties, often as part of the evidence that led to the development of the quantum model. Despite the showy nature of this demonstration, there is scarce evidence that it promotes student learning: “students move on, graduate, and [when] I cross paths with them, they remember that flame test that I did for them; they may not remember the chemistry, but they remember the demo.”64 While limited research has been reported regarding students’ understandings of atomic line spectra,65 there are no reports published to date of investigations into student thinking about flame tests and atomic emission.

■ THEORETICAL FRAMEWORKS

In order to conceptualize the interview guide to investigate students’ understandings about flame tests and atomic emission, two different theoretical frameworks were employed.

Meaningful Learning

Ausubel and Novak have written extensively about the construct of meaningful learning, which is the process where students make substantive connections between what they need to learn and what they already know, i.e., their prior knowledge.66,67 Meaningful learning stands in contrast to the process of rote memorization, where information is stored without forming connections to prior knowledge. Ye and Lewis explained rote learning as involving tasks such as “a first-year chemistry student [being asked] to recall the color of a particular metal when it is put in a flame test. As the information that is being recalled has no meaningful association with existing content knowledge, the information must be learned through rote learning.”68 Ausubel and Novak’s construct of meaningful learning has informed several chemistry education research studies, including research leading to the development of concept inventories regarding bonding representations,12 redox reactions,26 and challenges with learning organic chemistry.69,70

Johnstone’s Domains

To guide our choices for investigating students’ knowledge of atomic emission, the interview guide was constructed to intentionally explore Johnstone’s domains. According to Johnstone, the chemistry to be learned by students falls across three domains: the macroscopic domain (the tangible or visible), the particulate domain (the invisible), and the symbolic domain (formulas or representations).71 Johnstone has argued that one reason that science is so hard to learn is that, in formal science classrooms, students are expected to simultaneously understand the connections among all three domains, whereas they should learn about dyads between two of the domains rather than have all three inflicted upon them at once.72 Therefore, the interview guide in this research study, constructed to purposefully explore the connections (or lack thereof) in students’ understandings about atomic emission, was designed using prompts from two domains, namely, energy level diagrams (symbolic) and flame tests (macroscopic).

■ METHODS

This research employed a mixed-method, sequential design.73 Semistructured interviews74,75 were conducted to elicit student thinking about atomic emission. These findings were then used to create items and distractors for a concept inventory that was then used to quantify the prevalence of these understandings with a larger sample of students. Details on each of these procedures are provided below.

Interview Sample

The sample included students in advanced placement chemistry (AP, N = 14) and secondary students in a first-year chemistry course (SC, N = 12), with both groups enrolled at a large, suburban, public secondary school in the midwestern United States. The university sample consisted of undergraduate students in first-year chemistry (FYC, N = 14) and undergraduate students in upper-division chemistry (UD, N = 12), with both groups enrolled in a large, public, and predominantly undergraduate institution in the midwestern United States. The UD students were enrolled in either an instrumental analysis or an analytical chemistry course. All data collection was approved by the Institutional Review Board with regard to protection of the rights of human subjects with respect to informed consent and minimization of risk. Each student was assigned a pseudonym.

Interviews

To develop a “thick description”76 of students’ understandings about atomic emission, students were interviewed individually in an instructional laboratory setting so that flame tests could be conducted. Secondary student interviews lasted an average of 60 min and were conducted at their school. University student interviews lasted an average of 45 min. Interviews were both audio and video recorded in order to capture student interactions with the prompts described below.

The interviews consisted of four phases. Phase I asked open-ended questions about atomic emission, and students were asked to create representations77 to explain how atoms release energy. Students were asked to think aloud78 about the topic and were provided with a periodic table, a digital Livescribe Pulse Smartpen,79 and paper to draw if desired. Phase II focused on flame test demonstrations, asking students to respond to predict−observe−explain80 questions regarding what would happen to three chloride salts in a flame test, conducted one salt at a time. Of the 52 students interviewed, 48 reported being familiar with the flame test before it was conducted in the interview. The flame tests were used to prompt student thinking about atomic emission in the macroscopic domain of Johnstone’s triangle, and to elicit what connections, if any, the students identified between the macroscopic domain and atomic structure. In Phase III, students were asked to explain the conventions and symbols found in an energy level diagram for a hydrogen atom, such as the meaning of n (principal quantum number), the horizontal lines (energy levels), and the numbers (with a maximum of zero) indicating energy. The energy axis was labeled “Energy × 10²⁰ (J/atom)”. Phase III explored the students’ understanding of the symbolic domain and its connections to atomic structure with regard to electronic transitions. Lastly, in Phase IV, two additional energy level diagrams were shown to students: one with an arrow pointing from n = 2 to n = 4 and the other with an arrow pointing from n = 4 to n = 2. The students were asked to choose whether one, both, or neither of these diagrams might represent the flame tests they had carried out in Phase II of the interview. The intent of Phase IV was to elicit conceptions related to connections between the symbolic and the macroscopic domains.

Data Treatment

All interviews were transcribed verbatim using both the audio and video recordings. Data management was facilitated using QSR NVivo8.81 The interview transcripts were analyzed for alternative conceptions using the Constant Comparison

Method (CCM), which is an inductive data coding process used to categorize and compare qualitative data for analysis purposes.82 In CCM, a unit of data (i.e., an interview transcript) is analyzed and broken into codes based on emerging themes, which are then organized into categories that reflect a logical understanding of the codes. Then, this analyzed unit of data is systematically compared with another unit of data. Ideally, this iterative process goes on until theoretical saturation is reached, which occurs when no new categories or new information emerge from subsequent interviews. In this study, a purposeful sample of secondary and postsecondary chemistry students was used to include a diverse range of student expertise, with the goal of reaching theoretical saturation.

A detailed analysis of students’ explanations has been reported.83 The alternative conceptions that emerged through the analysis of student interviews formed the basis for creating questions and distractors for the flame test concept inventory (FTCI) in the form of multiple-choice items. For example, Question 8 on the FTCI is shown in Box 1, and the correct answer is indicated with an asterisk.

Box 1. Question 8 on the Flame Test Concept Inventory

Q8. Copper(II) chloride releases energy in a flame test when...
A. losing valence electrons
B. gaining valence electrons
C. breaking bonds in the compound
*D. valence electrons return to the ground state

All the distractors on the FTCI were inspired by multiple explanations offered by students during the clinical interviews. For example, responses A, B, and C in Question 8 were crafted from students’ explanations about losing/gaining electrons and breaking bonds during the flame test:

“In the flame test the valence electrons on the substances were lost to make them more stable that’s why the color of the flame stopped and when back to normal (referring to initial color) after a while, so this chart (energy level diagram with arrow pointing from n = 4 to n = 2) shows that after a while, after the flame test loses electrons, it became more stable, the energy decreases.” (Alex, SC)

“When atoms lose energy, (pause), when is releasing energy, one of them is gaining an electron and one of them is losing an electron (referring to arrows in energy level diagrams), just trying to figure out [pause], to go up in energy level (indicates energy level diagram with arrow going from n = 2 to n = 4), you have to gain an electron, but this is going to coincide with gaining an electron and this one (points at energy level diagram with arrow pointing from n = 4 to n = 2) is going to be losing an electron.” (Rachel, UD)

“When the copper chloride burns (during flame test) there are bonds breaking, one way the energy’s being released is in that certain wavelength of light that it is being emitted.” (Arthur, AP)

“Energy is stored in the bonds between atoms, not in the actual atoms itself, and when you break the bonds that’s when energy is released... The energy release by the bonds breaking through the heat (during flame test), and depending in what wavelength is the visible light spectrum, wherever that lands, that’s the color you see.” (Waldo, AP)

“The flame (in flame test) is required to have the reaction take place and to break the bonds apart and to make the electrons jump.” (Michael, FYC)

Expert Content Validation

A preliminary version of the FTCI was a paper-and-pencil inventory of 18 multiple-choice items. Each copy of the FTCI included a handout with color pictures of three chloride salts and their respective flame tests. Seven faculty members at the researchers’ institution who were experienced first-year chemistry and/or analytical chemistry instructors were asked to review the chemistry content of the FTCI for accuracy. Specifically, the faculty were asked to answer these three questions:

1. Which item(s), if any, do you find confusing? Why?
2. Which item(s) best represent atomic emission? Why?
3. Which topic(s) were omitted that you feel need to be included to best represent atomic emission? Why?

The faculty provided several comments and suggestions to improve this preliminary version of the FTCI. For example, one expert suggested that an item that included a student-generated Bohr atomic model to represent a multielectron substance be followed by an item that explicitly asked students about the limitations of that student-generated Bohr model. The item containing a Bohr atomic model was not deleted because that model was the one most commonly drawn by secondary students in their explanations of how atoms release energy. The expert was agreeable to retaining the item with the model as long as an additional item was added to explore students’ comprehension of the limitations of that model for multielectron atoms. Other experts suggested changes in wording for some items to more accurately reflect the chemistry that takes place in flame tests. For example, rather than talk about the ions in the flame, the word “ions” was changed to “atoms” because atomization more accurately describes the current scientific understanding of the multiple processes taking place in a flame. A careful analysis of expert suggestions, balanced against the need to keep the essence of the alternative conceptions found in the interviews, led to revisions in wording and the addition of two new items, resulting in the pilot-test version of the FTCI with 20 items.

Concept Inventory Administration

The 20-item version of the FTCI was pilot tested both with FYC students (N = 222) and with UD students (N = 40) enrolled in analytical chemistry courses. Seven of these students (5 FYC, 2 UD) subsequently participated in individual student validation interviews (described below), resulting in the deletion of two items, the addition of one item, and revisions in wording to others. The revised, final version of the FTCI, which consists of 19 items, was then administered to secondary high school chemistry students (SC, N = 308 and AP, N = 151) from 15 schools in 9 states across the United States, and to an additional FYC class (N = 100) giving consent for their results to be used in our research. Three weeks later, three students from the FYC class of N = 100 participated in validation interviews; no additional modifications were deemed necessary. The 19-item FTCI was then administered for a second time in this same classroom in order to conduct a test−retest reliability study. After omitting the responses from the three students who participated in the validation interviews, the responses of N = 80 students were used for the test−retest reliability study. In all cases, directions were given to students by their instructors regarding voluntary participation and student consent for the use of results in research. Students required 10−15 min to respond to the FTCI, and the FTCI was administered after students had been formally assessed by their instructors on the

topic of atomic emission in order to identify the most firmly held alternative conceptions. Excel and SPSS84 were used to analyze the data.

Student Validation Interviews

The student validation interviews offered insights as to how to further improve the quality of the data generated by the FTCI by identifying problems or ambiguities in the content and clarity of items that could lead to confusion for students.85 Students who reported being unfamiliar with the flame test agreed that the FTCI handout with color pictures of the flame tests was clear and self-explanatory. The first round of validation interviews, which each lasted approximately 45 min, was conducted 3 weeks after the administration of the 20-item FTCI. This waiting time was chosen because it has been shown that explicit memory does not affect responses in a test−retest situation if it is obtained at a 3 week interval.86 The validation interviews followed a think-aloud protocol where students reanswered all items, providing information about how well they understood the items and which strategies (elimination, guessing) they used to choose an answer. If a student was not guessing and chose a distractor, this indicated the distractor was appealing and/or rooted in the alternative conceptions identified in the four-phase flame test interviews.86

■ RESULTS AND DISCUSSION

Concurrent Validity

The scores for the pilot test of the 20-item FTCI for both FYC (N = 222) and UD (N = 25 and N = 19) students can be found in Table 1. Concurrent validity was examined to determine if the 20-item version of the FTCI could successfully distinguish between populations.87 The FYC mean (M = 6.61; SD = 2.77) was significantly lower than the mean for either class of UD students. The more experienced UD chemistry students scored higher than the less expert FYC students. Two one-tailed t-tests established that the two UD classes were equivalent,88 resulting in the subsequent merger of both courses’ data for analysis (n = 44; M = 11.5; SD = 2.83). A t-test indicated that the UD pilot-test FTCI scores were significantly higher than the FYC scores [t(264) = −11.50, p = 0.01, with eta squared (η² = 0.33) showing a large effect accounting for 33% of the variability between groups], thereby providing a form of concurrent validity. It bears noting that, even though the UD students outperformed the FYC students by a statistically significant amount, the average UD student answered only 11.5 out of 20 (57.5%) items on the 20-item FTCI.

Table 1. Student Scores for the 20-Item Pilot-Test Version and the Final 19-Item Version of the FTCI

Version               Scoring (Students)         N     Mean    SD      Median   Min   Max
20 items              1-tier (FYC)               222   6.61    2.770   6        1     17
                      1-tier (UD analytical)     25    12.56   3.150   13       7     18
                      1-tier (UD instrumental)   19    12.31   3.130   13       6     18
19 items              1-tier (SC)                308   6.01    2.910   6        1     15
                      2-tier (SC)                      4.29    2.344   4        0     11
                      1-tier (AP)                151   8.87    5.008   7        0     18
                      2-tier (AP)                      6.79    3.888   5        0     14
                      1-tier (FYC)               100   8.48    4.011   8        1     18
                      2-tier (FYC)                     6.09    3.220   6        0     14
19 items test−retest  1-tier test (FYC)          80    8.53    4.081   8        1     18
                      1-tier retest (FYC)              8.68    3.871   9        1     17
                      2-tier test (FYC)                6.08    3.352   6        0     14
                      2-tier retest (FYC)              6.18    2.647   6        0     12

Descriptive Statistics

The final version of the FTCI consists of 19 items, including four two-tier answer−reason pairs.5 Student responses were scored both as a one-tier FTCI with 19 points possible (“1” for correct, “0” for incorrect) and again as a two-tier FTCI with 15 points possible (“1” for correct answers to both the answer and the reason tiers, “0” for all other responses). Table 1 and Figures 1−3 summarize the final version scores for secondary students (SC), secondary students enrolled in an AP chemistry course (AP), and FYC students, including test−retest data. All SC, AP, and FYC score distributions were skewed to the left, that is, scores fell below the theoretical median.

The Kolmogorov−Smirnov (K−S) test was used to assess the normality of the distributions in Figures 1−3 as the student sample sizes were above 50.89 In the K−S test, the scores of interest are compared to a normal distribution and, therefore, p-values exceeding 0.05 indicate normality. Only the FYC scores were normally distributed, as shown in Table 2. One unusual feature of the AP scores in Figure 2 is the prominent number of students who scored 17/19. The AP data set includes responses from students with 5 different teachers in 4 states, but it would be incorrect to presume that the peak at 17 suggests there were only two (or three) questions that were particularly difficult for the AP students. In fact, there were six questions (nearly one-third of the items) on the FTCI for which 30% or more of the AP students chose incorrect distractors. The discriminatory power of the entire inventory was measured by Ferguson’s delta (δ), where δ ≥ 0.9 indicates a broad score distribution and good discrimination among all students.90 Both one-tier and two-tier scorings of the final version of the FTCI yielded satisfactory results for all students (Table 2).

Internal Consistency

The internal consistencies of SC, AP, and FYC scores on the final 19-item version of the FTCI were measured by calculating a Cronbach α, where values of α ≥ 0.7 are considered acceptable and, thus, indicate that all items closely measure the same construct.91 Kuder−Richardson (KR-20) indices were also calculated as a measure of internal consistency to examine the dichotomous scoring, either right or wrong, again with an accepted criterion value of 0.7 or greater. Both one-tier and two-tier FTCI scores yielded acceptable α values and KR-20 indices for both the AP and the FYC students (Table 3), indicating that all items closely measured the same construct.90 The scores of the less experienced SC students, however, did not exceed either threshold, reflecting a more fragmented knowledge about the concepts assessed by the FTCI.92

Internal consistency was also examined by conducting a test−retest analysis. Scores were available for N = 80 of the FYC students (see Table 1) and used to calculate a stability coefficient as a measure of consistency in responses using one-tier scoring. While the stability coefficient as measured by the Pearson correlation resulted in a strong correlation of 0.591, it did not meet the recommended threshold of 0.7. A lower correlation, however, is not cause for concern, as the appropriateness of internal consistency thresholds for CIs and alternative conceptions has been recently questioned given the challenges associated with measuring incomplete and incorrect student understandings.3,22,92 Examining the correlation

Figure 1. FTCI one-tier and two-tier score distributions for SC (N = 308) students.

Figure 2. FTCI one-tier and two-tier score distributions for AP (N = 151) students.

Figure 3. FTCI one-tier and two-tier score distributions for FYC (N = 100) students.

between scores in a test−retest design does not account for the fact that students with identical scores may not, in fact, hold the same misconceptions. While internal consistency can be artificially inflated by asking questions that iterate on the same concept, when instruments are designed to detect misconceptions that result from fragments of knowledge and incorrect relationships among concepts, the consistency among students’ responses is likely to be lower. Most importantly, Streiner93 has cautioned against stringent application of 0.7 as a threshold given that concept inventories are not unidimensional. The FTCI measures students’ knowledge of flame tests, but the constructs required for understanding here include the electronic structure of the atom and the properties of light.
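For readers who want to see how these reliability indices are computed, the sketch below shows Cronbach’s α, which for dichotomously scored items coincides with the KR-20 index, and Ferguson’s δ applied to a students × items matrix of 0/1 scores. This is a minimal plain-Python illustration; the toy data and function names are ours, not the authors’ SPSS analysis.

```python
from statistics import variance  # sample variance (denominator n - 1)

def cronbach_alpha(scores):
    """Cronbach's alpha for rows of per-student item scores.

    alpha = k/(k - 1) * (1 - sum(item variances) / variance(total scores)).
    For 0/1 (dichotomous) items this is identical to the KR-20 index.
    """
    k = len(scores[0])                                    # number of items
    item_var = sum(variance(col) for col in zip(*scores))
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

def ferguson_delta(totals, n_items):
    """Ferguson's delta for whole-test discrimination.

    delta = (n^2 - sum(f_i^2)) / (n^2 - n^2/(k + 1)), where f_i is the
    frequency of each possible total score 0..k; delta >= 0.9 is "broad".
    """
    n = len(totals)
    freqs = [totals.count(s) for s in range(n_items + 1)]
    return (n**2 - sum(f * f for f in freqs)) / (n**2 - n**2 / (n_items + 1))

# Toy data: 6 students x 4 dichotomously scored items (1 = correct).
X = [[1, 0, 1, 0],
     [1, 1, 1, 0],
     [0, 0, 1, 0],
     [1, 1, 1, 1],
     [0, 0, 0, 0],
     [1, 1, 0, 1]]
alpha = cronbach_alpha(X)                         # ~0.66 for this toy matrix
delta = ferguson_delta([sum(r) for r in X], 4)    # ~0.97 for this toy matrix
```

On real FTCI data, the rows would be the N students’ scored responses, and the α, KR-20 ≥ 0.7 and δ ≥ 0.9 criteria discussed above would be applied to the results.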

Table 2. Results for Test of Normality and Discrimination for FTCI Final Version Scores

Scoring (Student Categories)   K−S Statistic   df    p-Value   Ferguson’s δ
1-tier (SC)                    0.121           308   0.000     0.94
2-tier (SC)                    0.144           308   0.000     0.92
1-tier (AP)                    0.170           151   0.000     0.95
2-tier (AP)                    0.181           151   0.000     1.00
1-tier (FYC)                   0.088           100   0.052a    1.00
2-tier (FYC)                   0.920           100   0.035a    0.95

a These scores were found to be normally distributed.

Table 3. Internal Consistency for Final Version FTCI Scores

Student Categories   Scoring   Cronbach’s α   KR-20
SC                   1-tier    0.55           0.55
                     2-tier    0.50           0.50
AP                   1-tier    0.87           0.87
                     2-tier    0.83           0.83
FYC                  1-tier    0.76           0.77
                     2-tier    0.73           0.74

Item Analysis

Classical test theory was used to evaluate item difficulty, discrimination, and reliability. Item difficulty, p, is defined as the proportion of students answering that item correctly. The acceptable range of item difficulty is 0.30 < p < 0.80.90 A high p-value (p > 0.8) may indicate that the item was too easy for the test’s intended population and may not be appropriate for inclusion in an inventory designed to elicit students’ alternative conceptions. Note that a low p-value does not necessarily indicate a malfunctioning item. A valid item could be answered incorrectly by a large number of students because of the very fact that it addresses a deep-rooted alternative conception.

Item discrimination was calculated to determine how well each item differentiated between students who scored in the top 27% and those whose scores were in the bottom 27% of all scores.94 Discrimination index values of D > 0.3 are considered ideal.90 A low item discrimination index can be measured when an item is either too difficult or too easy because either extreme of difficulty corresponds to having all students getting an item either correct or incorrect.

Discrimination index versus item difficulty plots for SC, AP, and FYC students can be found in Figure 4. The majority of the items functioned acceptably for all students. A few items were answered correctly by fewer than 25% of the SC students, indicating that the items were particularly difficult for those students and that students either resorted to guessing (25% marks the guessing threshold for a 4-distractor item) or found another distractor particularly attractive. Note that the most difficult items also poorly discriminated, given that students in both the top and bottom 27% found them difficult.

Individual item reliability in the form of a point biserial (ρbis) was calculated as the correlation between each item’s score (correct = “1” or incorrect = “0”) and the overall test score. Satisfactory values for ρbis equal or exceed 0.2.90
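The three item statistics defined above can be made concrete with a short sketch (plain Python; the toy score matrix and helper name are ours, not the authors’ code). It computes item difficulty p, the top/bottom 27% discrimination index D, and the point biserial for one item:

```python
def item_stats(scores, item):
    """Difficulty p, discrimination D, and point biserial for one 0/1 item.

    scores: list of per-student rows of dichotomous (0/1) item scores.
    """
    n = len(scores)
    col = [row[item] for row in scores]
    totals = [sum(row) for row in scores]

    # Difficulty: proportion answering correctly (0.30 < p < 0.80 acceptable).
    p = sum(col) / n

    # Discrimination: p among the top 27% of total scores minus p among the
    # bottom 27% (D > 0.3 considered ideal).
    ranked = sorted(range(n), key=lambda i: totals[i])
    cut = max(1, round(0.27 * n))
    d = (sum(col[i] for i in ranked[-cut:]) - sum(col[i] for i in ranked[:cut])) / cut

    # Point biserial: Pearson correlation of the item score with the overall
    # test score (satisfactory values >= 0.2).
    mc, mt = sum(col) / n, sum(totals) / n
    cov = sum((c - mc) * (t - mt) for c, t in zip(col, totals))
    var_c = sum((c - mc) ** 2 for c in col)
    var_t = sum((t - mt) ** 2 for t in totals)
    r_pbis = cov / (var_c * var_t) ** 0.5
    return p, d, r_pbis

# Toy data: 6 students x 4 items; statistics for item 0.
X = [[1, 0, 1, 0], [1, 1, 1, 0], [0, 0, 1, 0],
     [1, 1, 1, 1], [0, 0, 0, 0], [1, 1, 0, 1]]
p, d, r = item_stats(X, 0)
```

An item everyone answers the same way has both extreme p and near-zero D, which is the coupling between difficulty and discrimination noted in the text.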

Figure 4. Discrimination index versus item difficulty with one-tier scoring. (The number next to each dot indicates the FTCI item.)


The majority of the items functioned acceptably for all students. Notably, items 9 and 10 (Box 2; correct answers are marked with an asterisk), which asked students about representing absorbance and emission using a Bohr atomic model and about the limitations of doing so, both had low ρbis values for both SC and AP students, indicating the strong appeal of the distractors in both of those items. The ρbis was low for FYC students for item 10, which is the item that was strongly recommended during expert content validation. Student validation interviews confirmed that students were not explicitly guessing the answer, that they understood the intention of items 6 and 10, but that they were still strongly attracted to the distractors.

Alternative Conceptions as Measured by the FTCI

The FTCI items elicited alternative conceptions about atomic emission. Each FTCI distractor is directly tied to an alternative conception revealed through the interviews and falls into one of the seven categories provided in Table 4.

Table 4. FTCI Items as They Correspond to Alternative Conceptions about Atomic Emission

    Alternative Conception Categories                               Corresponding FTCI Item(s)
a   Misrepresentations of atomic emission                           9, 10, 11, and 12
b   Atomic properties affecting atomic absorption and emission      4, 15
c   Breaking and/or forming bonds affect absorption and emission    3, 5
d   Losing and/or gaining electrons                                 2, 6, 8
e   Process related alternative conceptions                         7, 13, and 14
f   Heuristics                                                      16 and 17; 18 and 19
g   Terminology used out of context                                 1

Not one SC student correctly answered both items 4 and 15 (category b), and only 8% correctly answered items 9, 10, 11, and 12 (category a). Figure 5 shows that 94% of SC students held at least six common alternative conceptions about atomic emission. It bears repeating that these data were collected after students had completed instruction and been tested by their instructor on these ideas.

Only 4% of AP students correctly answered items 9, 10, 11, and 12 (category a), while 45% of students correctly answered both items 4 and 15 (category b). Distractors in category a were the most commonly selected by AP students, while distractors in categories b and f were the least selected by these students. Figure 5 shows that 75% of AP students held at least four common alternative conceptions about atomic emission.

Only 4% of FYC students correctly answered items 9, 10, 11, and 12 (category a), while 46% of students correctly answered both items 4 and 15 (category b). Distractors in category a were the most commonly selected by FYC students, while distractors in categories b and f were the least selected by these students. Figure 5 shows that 85% of FYC students held at least four common alternative conceptions about atomic emission.

The distractors can be further categorized according to the students’ difficulties with connecting Johnstone’s domains (Table 5). For example, “macroscopic/particulate” refers to the difficulties that students had with connections between the macroscopic and particulate domains. Students’ alternative conceptions regarding the connections between the macroscopic and particulate domains are evidenced by their selection of distractors across five different FTCI items (Table 5). Students’ alternative conceptions at the “macroscopic/symbolic” interface are evident in the popularity of distractors in Table 5 regarding the colors of the flame test and their connection to energy level diagrams. It was particularly challenging for students to connect the energy level diagrams to a color such as “red”. During the validation interviews, students were focusing on the
23 DOI: 10.1021/acs.jchemed.7b00594
J. Chem. Educ. 2018, 95, 17−27
Journal of Chemical Education Article

direction of arrows and the terms “absorption” and “emission”, rather than explaining how the size of the gap between energy levels would result in the generation of different colors being observed. Their confusion about the relationship between the symbolic/particulate domains was also revealed by their combined responses to a two-tier question, where the first tier asked them to choose a symbolic representation of emission and the second tier asked them to choose a reason for their symbolic selection.

Figure 5. Students holding one or more alternative conceptions about atomic emission in the context of a flame test.

Table 5. Comparison of Students Choosing FTCI Distractors about Atomic Emission as They Correspond to Johnstone’s Domains
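For readers unfamiliar with the item statistic used in the analysis above, a hedged sketch of the point-biserial coefficient (ρbis) may help: it is the Pearson correlation between a dichotomous item score (1 = correct, 0 = incorrect) and each student's total score, so low values flag items whose distractors attract even high-scoring students. The data below are invented for illustration and are not FTCI responses.

```python
# Sketch of the point-biserial item statistic (rho_bis): the Pearson
# correlation between a 0/1 item score and the total test score.
# All response data below are invented, not taken from the FTCI study.
from statistics import mean, pstdev

def point_biserial(item: list[int], totals: list[float]) -> float:
    """Correlation between dichotomous item scores and total scores."""
    mi, mt = mean(item), mean(totals)
    cov = mean((i - mi) * (t - mt) for i, t in zip(item, totals))
    return cov / (pstdev(item) * pstdev(totals))

# Ten hypothetical students, sorted by total score.
totals = [19, 17, 15, 14, 12, 10, 9, 7, 5, 3]
item_a = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]   # tracks total score: discriminates well
item_b = [1, 0, 1, 0, 0, 1, 0, 1, 0, 1]   # near-random: low rho_bis

print(f"rho_bis(A) = {point_biserial(item_a, totals):.2f}")
print(f"rho_bis(B) = {point_biserial(item_b, totals):.2f}")
```

Item B behaves like the flagged FTCI items: strong students pick distractors about as often as weak students, driving ρbis toward zero.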
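The discussion above hinges on the idea students struggled with: the size of the gap between energy levels fixes the wavelength, and hence the color, of the emitted photon via E = hc/λ. A minimal sketch of that relationship follows; the energy-gap values are approximate illustrations chosen to match two well-known flame-test lines, not content drawn from the FTCI.

```python
# Illustrative only: converting an energy-level gap to an emission
# wavelength with E = hc/lambda. Gap values below are approximations.
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def emission_wavelength_nm(gap_ev: float) -> float:
    """Wavelength (nm) of the photon emitted when an electron relaxes
    across an energy gap of gap_ev electronvolts."""
    return H * C / (gap_ev * EV) * 1e9

# Larger gap -> shorter wavelength; these gaps land near the classic
# red Li line (~671 nm) and yellow Na doublet (~589 nm).
for label, gap in [("Li, red line", 1.85), ("Na, yellow doublet", 2.10)]:
    print(f"{label}: {gap} eV -> {emission_wavelength_nm(gap):.0f} nm")
```

This is the reasoning step students skipped when they focused only on arrow direction in the energy level diagrams rather than on gap size.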


■ CONCLUSIONS AND IMPLICATIONS FOR TEACHING

The flame test concept inventory generates reliable and valid data for instructors’ use in assessing their students’ alternative conceptions about atomic emission. The FTCI is an easy-to-administer CI and requires only 10−15 min, but if an instructor has limited class time, s/he may select individual items for in-class discussion or for use as a formative assessment in classroom response systems, for example, “clicker” questions.

The flame test is a colorful demonstration that easily captures the attention of students. However, the results from student interviews and responses to the FTCI suggest that there is a large gap between the positive affective response garnered by the flame test and the cognitive understanding of what takes place and what these observations suggest to chemists about atomic structure. Instructors could engage students in their classroom using the flame test demonstration accompanied by the “predict, observe, and explain” tasks used in the interview protocol and have a class discussion.

Instructors who teach upper-division courses may wish to formatively assess what prior knowledge their students bring with them as residual from earlier coursework in chemistry. The development of the FTCI shed light onto alternative conceptions related to challenges with understanding representations of the Bohr atomic model and energy level diagrams. Interviews with secondary students indicated their preference for the Bohr atomic model to explain how atoms release energy. The FTCI includes both of these representations because students bring these preferences to their first-year chemistry classrooms. Instructors can assess understanding of the limitations of these models and start classroom discussions. The FTCI may also be administered in a classroom after formal instruction, similar to how the data were collected in the current study.

Colleagues interested in obtaining a copy of the FTCI (including the color handout of the flame tests) for classroom use or additional research should contact the corresponding author.

■ AUTHOR INFORMATION

Corresponding Author
*E-mail: bretzsl@miamioh.edu.

ORCID
Stacey Lowery Bretz: 0000-0001-5503-8987

Notes
The authors declare no competing financial interest.

■ ACKNOWLEDGMENTS

This material is based upon work supported by the National Science Foundation under Grant 0733642. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. We thank the students and instructors who made this study possible.

■ REFERENCES

(1) Treagust, D. F. Diagnostic Assessment in Science as a Means to Improving Teaching, Learning and Retention. In Proceedings of the Assessment in Science Teaching and Learning Symposium; The University of Sydney; UniServe Science: Sydney, Australia, 2006; pp 1−9.
(2) Libarkin, J. C. Concept Inventories in Higher Education Science. Manuscript prepared for the National Research Council Promising Practices in Undergraduate STEM Education Workshop 2, Washington, DC, Oct 13−14, 2008. http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072624.pdf (accessed Oct 2017).
(3) Adams, W. K.; Wieman, C. E. Development and Validation of Instruments To Measure Learning of Expert-Like Thinking. Int. J. Sci. Educ. 2011, 33 (9), 1289−1312.
(4) Krause, S.; Birk, J.; Bauer, R.; Jenkins, B.; Pavelich, M. J. Development, Testing, and Application of a Chemistry Concept Inventory. Paper presented at the 34th ASEE/IEEE Frontiers in Education Conference, Savannah, GA, Oct 20−23, 2004; Institute of Electrical and Electronics Engineers: Piscataway, NJ, 2004. DOI: 10.1109/FIE.2004.1408473. http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=1408473 (accessed Oct 2017).
(5) Treagust, D. F. Development and Use of Diagnostic Tests To Evaluate Students’ Misconceptions in Science. Int. J. Sci. Educ. 1988, 10, 159−169.
(6) Bretz, S. L. Designing Assessment Tools To Measure Students’ Conceptual Knowledge of Chemistry. In Tools of Chemistry Education Research; Bunce, D. M., Cole, R. S., Eds.; American Chemical Society: Washington, DC, 2014; pp 155−168.
(7) National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment; The National Academies Press: Washington, DC, 2001; pp 1−14. DOI: 10.17226/10019. https://www.nap.edu/catalog/10019/knowing-what-students-know-the-science-and-design-of-educational (accessed Oct 2017).
(8) Sadler, P. M. Psychometric Models of Student Conceptions in Science: Reconciling Qualitative Studies and Distractor-Driven Assessment Instruments. J. Res. Sci. Teach. 1998, 35 (3), 265−296.
(9) Mulford, D.; Robinson, W. An Inventory for Alternate Conceptions among First-Semester General Chemistry Students. J. Chem. Educ. 2002, 79 (6), 739−744.
(10) Othman, J.; Treagust, D. F.; Chandrasegaran, A. L. An Investigation into the Relationship between Students’ Conceptions of the Particulate Nature of Matter and Their Understanding of Chemical Bonding. Int. J. Sci. Educ. 2008, 30 (11), 1531−1550.
(11) Nyachwaya, J. M.; Mohamed, A.-R.; Roehrig, G. H.; Wood, N. B.; Kern, A. L.; Schneider, J. L. The Development of an Open-Ended Drawing Tool: An Alternative Diagnostic Tool for Assessing Students’ Understanding of the Particulate Nature of Matter. Chem. Educ. Res. Pract. 2011, 12, 121−132.
(12) Luxford, C. J.; Bretz, S. L. Development of the Bonding Representations Inventory To Identify Student Misconceptions about Covalent and Ionic Bonding Representations. J. Chem. Educ. 2014, 91 (3), 312−320.
(13) Peterson, R. F.; Treagust, D. F.; Garnett, P. Identification of Secondary Students’ Misconceptions of Covalent Bonding and Structure Concepts Using a Diagnostic Instrument. Res. Sci. Educ. 1986, 16, 40−48.
(14) Artdej, R.; Ratanaroutai, T.; Coll, R. K.; Thongpanchang, T. Thai Grade 11 Students’ Alternative Conceptions for Acid−Base Chemistry. Res. Sci. Technol. Educ. 2010, 28 (2), 167−183.
(15) Chandrasegaran, A. L.; Treagust, D. F.; Mocerino, M. The Development of a Two-Tier Multiple-Choice Diagnostic Instrument for Evaluating Secondary School Students’ Ability To Describe and Explain Chemical Reactions Using Multiple Levels of Representation. Chem. Educ. Res. Pract. 2007, 8, 293−307.
(16) Linke, R. D.; Venz, M. I. Misconceptions in Physical Science among Non-Science Background Students: II. Res. Sci. Educ. 1979, 9, 103−109.
(17) Wren, D.; Barbera, J. Gathering Evidence for Validity during the Design, Development, and Qualitative Evaluation of Thermochemistry Concept Inventory Items. J. Chem. Educ. 2013, 90 (12), 1590−1601.
(18) Banerjee, A. C. Misconceptions of Students and Teachers in Chemical Equilibrium. Int. J. Sci. Educ. 1991, 13 (4), 487−494.


(19) Voska, K. W.; Heikkinen, H. W. Identification and Analysis of Student Conceptions Used To Solve Chemical Equilibrium Problems. J. Res. Sci. Teach. 2000, 37 (2), 160−176.
(20) Tan, K. C. D.; Goh, N. K.; Chia, L. S.; Treagust, D. F. Development and Application of a Two-Tier Multiple-Choice Diagnostic Instrument To Assess High School Students’ Understanding of Inorganic Chemistry Qualitative Analysis. J. Res. Sci. Teach. 2002, 39 (4), 283−301.
(21) Tan, K.-C. D.; Taber, K. S.; Goh, N.-K.; Chia, L.-S. The Ionization Energy Diagnostic Instrument: A Two-Tier Multiple-Choice Instrument To Determine High School Students’ Understanding of Ionization Energy. Chem. Educ. Res. Pract. 2005, 6, 180−197.
(22) McClary, L. M.; Bretz, S. L. Development and Assessment of a Diagnostic Tool To Identify Organic Chemistry Students’ Alternative Conceptions Related to Acid Strength. Int. J. Sci. Educ. 2012, 34 (15), 2317−2341.
(23) Adadan, E.; Savasci, F. An Analysis of 16−17-Year-Old Students’ Understanding of Solution Chemistry Concepts Using a Two-Tier Diagnostic Instrument. Int. J. Sci. Educ. 2012, 34 (4), 513−544.
(24) Linke, R. D.; Venz, M. I. Misconceptions in Physical Science among Non-Science Background Students. Res. Sci. Educ. 1978, 8, 183−193.
(25) Dick-Perez, M.; Luxford, C. J.; Windus, T. L.; Holme, T. A. A Quantum Chemistry Concept Inventory for Physical Chemistry Classes. J. Chem. Educ. 2016, 93 (4), 605−612.
(26) Brandriet, A. R.; Bretz, S. L. The Development of the Redox Concept Inventory as a Measure of Students’ Symbolic and Particulate Redox Understandings and Confidence. J. Chem. Educ. 2014, 91 (8), 1132−1144.
(27) Villafañe, S.; Heyen, B. J.; Lewis, J. E.; Loertscher, J.; Minderhout, V.; Murray, T. A. Design and Testing of an Assessment Instrument to Measure Understanding of Protein Structure and Enzyme Inhibition in a New Context. Biochem. Mol. Biol. Educ. 2012, 44, 179−190.
(28) Bretz, S. L.; Linenberger, K. J. Development of the Enzyme−Substrate Interactions Concept Inventory. Biochem. Mol. Biol. Educ. 2012, 40 (4), 229−233.
(29) Oser, J. I. Flame Tests. J. Chem. Educ. 1928, 5 (2), 192.
(30) Clark, A. R. The Test-Tube Method for Flame Testing. J. Chem. Educ. 1935, 12 (5), 242−243.
(31) Clark, A. R. Test-Tube Flame Test Applied to the Rarer Elements. J. Chem. Educ. 1936, 13 (8), 383−384.
(32) Kiplinger, C. C. Paper for Platinum in Flame Tests. J. Chem. Educ. 1941, 18 (6), 297.
(33) Anderson, H.; Corwin, J. F. A Simple Method of Demonstrating Flame Tests. J. Chem. Educ. 1947, 24 (9), 443.
(34) Brown, J. A. Lacquer Color Filters for Qualitative Flame Tests. J. Chem. Educ. 1953, 30 (7), 363−364.
(35) Strong, F. C., III. Improving Potassium Flame Tests. J. Chem. Educ. 1969, 46 (3), 178.
(36) Smith, D. D. Producing Flame Spectra. J. Chem. Educ. 1979, 56 (1), 48.
(37) Pearson, R. S. An Improved Calcium Flame Test. J. Chem. Educ. 1985, 62 (7), 622.
(38) Bouher, J. H. Capillary Tube Flame Test. J. Chem. Educ. 1986, 63 (2), 158.
(39) Ager, D. J.; East, M. B.; Miller, R. A. Vivid Flame Tests. J. Chem. Educ. 1988, 65 (6), 545−546.
(40) Gouge, E. M. A Flame Test Demonstration Device. J. Chem. Educ. 1988, 65 (6), 544−545.
(41) Peyser, J. R.; Luoma, J. R. Flame Colors Demonstration. J. Chem. Educ. 1988, 65 (5), 452−453.
(42) Mattson, B. M.; Snipp, R. L.; Michels, G. D. Spectacular Classroom Demonstration of the Flame Test for Metal Ions. J. Chem. Educ. 1990, 67 (9), 791.
(43) Barnes, Z. K. Alternative Flame Test Procedures. J. Chem. Educ. 1991, 68 (3), 246.
(44) Ragsdale, R. O.; Driscoll, J. A. Rediscovering the Wheel: The Flame Test Revisited. J. Chem. Educ. 1992, 69 (10), 828−829.
(45) Thomas, N. C.; Brown, R. A Spectacular Demonstration of Flame Tests. J. Chem. Educ. 1992, 69 (4), 326−327.
(46) McRae, R. A.; Jones, R. F. An Inexpensive Flame Test Technique. J. Chem. Educ. 1994, 71 (1), 68.
(47) Li, J.; Peng, A.-Z. Multiple Burning Heaps of Color − An Elegant Variation of a Flame Test. J. Chem. Educ. 1995, 72 (9), 828.
(48) Dalby, D. K. Bigger and Brighter Flame Tests. J. Chem. Educ. 1996, 73 (1), 80−81.
(49) Bare, W. D.; Bradley, T.; Pulliam, E. An Improved Method for Students’ Flame Tests in Qualitative Analysis. J. Chem. Educ. 1998, 75 (4), 459.
(50) McKelvy, G. M. Flame Tests That Are Portable, Storable, and Easy To Use. J. Chem. Educ. 1998, 75 (1), 55.
(51) Dragojlovic, V.; Jones, R. F. Flame Tests Using Improvised Alcohol Burners. J. Chem. Educ. 1999, 76 (7), 929−930.
(52) Johnson, K. A.; Schreiner, R. A Dramatic Flame Test Demonstration. J. Chem. Educ. 2001, 78 (5), 640−641.
(53) Sanger, M. J. Flame Tests: Which Ion Causes the Color? J. Chem. Educ. 2004, 81 (12), 1776A−1776B.
(54) Sanger, M. J.; Phelps, A. J.; Banks, C. Simple Flame Test Techniques Using Cotton Swabs. J. Chem. Educ. 2004, 81 (7), 969−970.
(55) Mortier, T.; Wellens, A.; Janssens, M.-J. Inexpensive Alcohol Burners for Flame Tests Using Aluminum Tea Light Candle Holders. J. Chem. Educ. 2008, 85 (4), 522.
(56) Vitz, E. Demonstration Extensions: Flame Tests and Electrolysis. J. Chem. Educ. 2008, 85 (4), 522.
(57) Landis, A. M.; Davies, M. I.; Landis, L.; Thomas, N. C. “Magic Eraser” Flame Tests. J. Chem. Educ. 2009, 86 (5), 577−578.
(58) Maines, L. L.; Bruch, M. D. Identification of Unknown Chloride Salts Using a Combination of Qualitative Analysis and Titration with Silver Nitrate: A General Chemistry Laboratory. J. Chem. Educ. 2012, 89 (7), 933−935.
(59) Neel, B.; Crespo, G. A.; Perret, D.; Cherubini, T.; Bakker, E. Camping Burner-Based Flame Emission Spectrometer for Classroom Demonstrations. J. Chem. Educ. 2014, 91 (1), 1655−1660.
(60) Yu, H. L. L.; Domingo, P. N., Jr.; Yanza, E. R. S.; Guidote, A. M., Jr. Making a Low-Cost Soda Can Ethanol Burner for Out-of-Laboratory Flame Test Demonstrations and Experiments. J. Chem. Educ. 2015, 92 (1), 127−128.
(61) Murphy, K.; Holme, T.; Zenisky, A.; Caruthers, H.; Knaus, K. Building the ACS Exams Anchoring Concept Content Map for Undergraduate Chemistry. J. Chem. Educ. 2012, 89 (6), 715−720.
(62) Holme, T.; Murphy, K. The ACS Exams Institute Undergraduate Chemistry Anchoring Concepts Content Map I: General Chemistry. J. Chem. Educ. 2012, 89 (6), 721−723.
(63) Holme, T. A.; Luxford, C. J.; Murphy, K. L. Updating the General Chemistry Anchoring Concepts Content Map. J. Chem. Educ. 2015, 92, 1115−1116.
(64) Price, D. S.; Brooks, D. W. Extensiveness and Perceptions of Lecture Demonstrations in the High School Chemistry Classroom. Chem. Educ. Res. Pract. 2012, 13, 420−427.
(65) Körhasan, J. D.; Wang, L. Students’ Mental Models of Atomic Spectra. Chem. Educ. Res. Pract. 2016, 17, 743−755.
(66) Ausubel, D. Educational Psychology: A Cognitive View; Holt, Rinehart and Winston: New York, 1968.
(67) Novak, J. D. Human Constructivism: A Unification of Psychological and Epistemological Phenomena in Meaning Making. Int. J. Pers. Constr. Psych. 1993, 6 (2), 167−193.
(68) Ye, L.; Lewis, S. E. Looking for Links: Examining Student Responses in Creative Exercises for Evidence of Linking Chemistry Concepts. Chem. Educ. Res. Pract. 2014, 15, 576−586.
(69) Bretz, S. L. Human Constructivism and Meaningful Learning. J. Chem. Educ. 2001, 78 (8), 1107.
(70) Grove, N. P.; Bretz, S. L. A Continuum of Learning: From Rote Memorization to Meaningful Learning in Organic Chemistry. Chem. Educ. Res. Pract. 2012, 13, 201−208.

(71) Johnstone, A. H. You Can’t Get There from Here. J. Chem. Educ. 2010, 87 (1), 22−29.
(72) Johnstone, A. H. Why Is Science Difficult To Learn? Things Are Seldom What They Seem. J. Comput. Assist. Learn. 1991, 7 (2), 75−83.
(73) Towns, M. H. Mixed Methods Designs in Chemical Education
Research. In Nuts and Bolts of Chemical Education Research; Bunce, D.
M., Cole, R. S., Eds.; American Chemical Society: Washington, DC,
2008; pp 135−148.
(74) Bretz, S. L. Qualitative Research Designs in Chemistry
Education Research. In Nuts and Bolts of Chemical Education Research;
Bunce, D. M., Cole, R. S., Eds.; American Chemical Society:
Washington, DC, 2008; pp 79−96.
(75) Phelps, A. J. Qualitative Methodologies in Chemical Education
Research: Challenging Comfortable Paradigms. J. Chem. Educ. 1994,
71 (3), 191−194.
(76) Geertz, C. Thick Description: Toward an Interpretive Theory of
Culture. In The Interpretation of Cultures: Selected Essays; Geertz, C.,
Ed.; Basic Books: New York, 1973; pp 3−30.
(77) Linenberger, K. J.; Bretz, S. L. A Novel Technology To
Investigate Students’ Understanding of Enzyme Representations. J.
Coll. Sci. Teach. 2012, 42 (1), 45−49.
(78) Bowen, C. W. Think-Aloud Methods in Chemistry Education:
Understanding Student Thinking. J. Chem. Educ. 1994, 71 (3), 184−
190.
(79) Livescribe. http://www.livescribe.com/en-us/ (accessed Oct
2017).
(80) White, R.; Gunstone, R. F. Probing Understanding; Falmer:
London, 1992.
(81) QSR International. http://www.qsrinternational.com/product
(accessed Oct 2017).
(82) Creswell, J. W. Qualitative Inquiry & Research Design: Choosing
among Five Approaches; Sage Publications: Thousand Oaks, CA, 2007.
(83) Mayo, A. V. Atomic Emission Misconceptions As Investigated
through Student Interviews and Measured by the Flame Test Concept
Inventory. Doctoral dissertation. Miami University, Oxford, OH, 2012.
https://etd.ohiolink.edu/pg_10?0::NO:10:P10_ACCESSION_
NUM:miami1362754897 (accessed Oct 2017).
(84) SPSS. https://www.ibm.com/analytics/us/en/technology/spss/
(accessed Oct 2017).
(85) Leighton, J. P.; Heffernan, C.; Cor, M. K.; Gokiert, R. J.; Cui, Y.
An Experimental Test of Student Verbal Reports and Teacher
Evaluations as a Source of Validity Evidence for Test Development.
Appl. Meas. Educ. 2011, 24 (4), 324−348.
(86) McKelvie, S. J. Does Memory Contaminate Test−Retest
Reliability? J. Gen. Psych. 1992, 119 (1), 59−72.
(87) Trochim, W. M. K. Measurement Validity Types. Research
Methods Knowledge Base. http://www.socialresearchmethods.net/kb/
measval.php (accessed Oct 2017).
(88) Lewis, S. E.; Lewis, J. E. The Same or Not the Same:
Equivalence as an Issue in Educational Research. J. Chem. Educ. 2005,
82 (9), 1408−1412.
(89) Razali, N. M.; Wah, Y. B. Power Comparisons of Shapiro-Wilk,
Kolmogorov-Smirnov, Lilliefors and Anderson-Darling tests. J. Stat.
Model. and Analytics 2011, 2 (1), 21−33.
(90) Ding, L.; Beichner, R. Approaches to Data Analysis of Multiple-
Choice Questions. Phys. Rev. ST Phys. Educ. Res. 2009, 5 (2), 020101−
02010317.
(91) Cronbach, L. J. Coefficient Alpha and the Internal Structure of
Tests. Psychometrika 1951, 16 (3), 297−334.
(92) Bretz, S. L.; McClary, L. M. Students’ Understandings of Acid
Strength: How Meaningful Is Reliability When Measuring Alternative
Conceptions? J. Chem. Educ. 2015, 92 (2), 212−219.
(93) Streiner, D. L. Starting at the Beginning: An Introduction to
Coefficient Alpha and Internal Consistency. J. Pers. Assess. 2003, 80
(1), 99−103.
(94) Crocker, L. M.; Algina, J. Introduction to Classical and Modern
Test Theory; Holt, Rinehart, and Winston: New York, 1986.
