Making Inquiry-Based Science Learning Visible: The Influence of CVS and Cognitive Skills On Content Knowledge Learning in Guided Inquiry
To cite this article: Anita Stender, Martin Schwichow, Corinne Zimmerman & Hendrik Härtig
(2018): Making inquiry-based science learning visible: the influence of CVS and cognitive skills
on content knowledge learning in guided inquiry, International Journal of Science Education, DOI:
10.1080/09500693.2018.1504346
Department of Physics Education, University of Duisburg-Essen, Duisburg, Germany; Department of Physics Education, University of Education Freiburg, Freiburg im Breisgau, Germany; Department of Psychology, Illinois State University, Normal, IL, USA
Science is not just a body of knowledge that reflects current understanding of the world; it
is also a set of practices used to establish, extend, and refine that knowledge (National
Research Council, 2012). This twofold meaning of science is reflected in science curricula
of various countries (e.g. USA: NGSS Lead States, 2013; Germany: KMK, 2005). As scientific knowledge is now widely available to anyone with Internet access, the ability
to find, interpret, and evaluate the quality of the resources used for generating knowledge is
becoming increasingly important. Many reforms and innovations in science education
focus on scientific reasoning and argumentation skills, as they are crucial for participating
Inquiry learning
For at least three decades, the term inquiry learning has been used in science education as a
synonym for ‘good and meaningful’ science teaching (Abd-El-Khalick et al., 2004; Ander-
son, 2002). The assertion that students should learn science by mimicking the process of
knowledge construction in science is not new (Dewey, 1910) and has been a leading idea in
science education reforms since the 1950s (Chiappetta, 1997). In science education,
inquiry learning describes an approach in which students learn by actively using scientific
methods to answer research questions (Anderson, 2002; Bell et al., 2005; Chiappetta,
1997). This definition implies two minimum requirements for characterising an activity
as inquiry learning: (a) students answer research questions by (b) applying scientific
methods. Accordingly, activities such as building models of atoms or assembling wildflower collections are not inquiry-learning activities because students are not answering
research questions. Collecting information in a library or on the Internet to find out
how liquid oxygen is produced is not an inquiry-learning activity because even though stu-
dents are answering a question, they are not applying scientific methods. This analysis of
what counts as ‘inquiry learning’ does not imply that these activities are not important; it
just means that they are not sufficient to meet a strict definition of inquiry learning (Bell
et al., 2005).
However, the whole inquiry learning cycle of asking research questions, collecting data,
and interpreting the results is not always used in science classes. In practice, activities vary.
In some cases, the teacher provides the research questions, methods of data collection, and
interpretation; in other cases, the teacher leaves the responsibility for some or all of these
components of inquiry learning to students. Bell et al. (2005) define four levels of inquiry
learning based on the amount of guidance from the teacher (Table 1). In verification (level
0) the teacher provides the research questions, methods of data collection, and guidance
for how to interpret the data to answer the research questions. Level 0 activities are tra-
ditional ‘cookbook-like’ activities that provide a recipe for answering research questions.
In structured inquiry learning (level 1) students are provided with the research questions
and the data collection methods but are not given guidance on how to interpret the data.
In guided inquiry learning (level 2) students are given the research questions but no further
guidance. In open inquiry learning (level 3) students choose a data collection method and
interpret data to answer their own research questions without any guidance. This rubric
captures the varying levels of complexity of inquiry learning in classroom settings and
the amount of guidance provided. Hence, younger and less experienced students should
work on lower-level activities, whereas students with more experience should work on
more complex inquiry-learning activities to practice and improve their scientific process
skills. Researchers can also use this rubric to interpret evidence about the effect of
inquiry learning, as it provides criteria to systematically compare different inquiry-learn-
ing activities used in classrooms or labs.
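For illustration, the rubric can be encoded as a small lookup structure. The level names follow Bell et al. (2005); the encoding, the component labels, and the `classify_inquiry` helper are our own hypothetical sketch, not part of the original rubric.

```python
# Hypothetical encoding of the Bell et al. (2005) inquiry rubric:
# each level records which components the TEACHER provides.
INQUIRY_LEVELS = {
    0: {"name": "verification", "teacher_provides": {"question", "method", "interpretation"}},
    1: {"name": "structured",   "teacher_provides": {"question", "method"}},
    2: {"name": "guided",       "teacher_provides": {"question"}},
    3: {"name": "open",         "teacher_provides": set()},
}

def classify_inquiry(teacher_provides):
    """Return (level, name) for the level whose teacher-provided
    components exactly match the given set."""
    for level, spec in INQUIRY_LEVELS.items():
        if spec["teacher_provides"] == set(teacher_provides):
            return level, spec["name"]
    raise ValueError("No matching inquiry level")
```

For example, an activity in which the teacher supplies only the research question is classified as level 2, guided inquiry.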
learning outcomes as well as the process quality measures did not differ between different
types of guidance, whereas the quality of the products created during inquiry was greater
when students received a specific explanation of how to perform an action (Lazonder &
Harmsen, 2016). Providing students with explanations for science processes during struc-
tured or guided inquiry (level 1 and 2) might, therefore, be an important step for students to
generate an adequate understanding of the conceptual content under investigation (Lazon-
der & Harmsen, 2016). To further examine this assumption, we describe empirical findings
regarding the impact of students’ understanding and application of science process skills
during guided inquiry activities on their learning of science content.
the conceptual and procedural demands of CVS (Chen & Klahr, 1999; Klahr, Chen, &
Toth, 2001; Künsting, Thillmann, Wirth, Fischer, & Leutner, 2008; McElhaney, 2011).
However, these studies only coded whether students showed a change in beliefs regarding
the causal status of the variables under investigation and not if students could transfer or
generalise this newly acquired knowledge to related scientific concepts. For example, when
learning how to control variables to investigate the effect of mass on the period of a pen-
dulum, the speed of a sinking object, or the distance an object travels down a ramp, does
this belief change about mass translate into more scientifically accurate conceptions about
the physics of mass? It is an open question whether students can generalise the belief
changes that occur about the causal status of the variables under investigation into a
more robust understanding of the science content.
Cognitive skills
Scientific reasoning skills are crucial for inquiry learning, but factors that underlie individ-
ual differences are largely unknown (Chen & Klahr, 1999; Klahr et al., 2001; Künsting
et al., 2008; McElhaney, 2011). General information-processing skills such as analogical
reasoning, encoding, reading skills, and strategy development seem to be necessary to
coordinate scientific reasoning processes (Morris, Croker, Masnick, & Zimmerman,
2012; van der Graaf, Segers, & Verhoeven, 2016). These skills help students to acquire,
coordinate and store new information in memory.
Two lines of research have investigated the relationship between scientific
reasoning and cognitive skills. One line of research has focused on whether scientific reason-
ing is a unique skill that is independent of other cognitive skills (Koerber, Mayer, Osterhaus,
Schwippert, & Sodian, 2015; Mayer, Sodian, Koerber, & Schwippert, 2014; Nehring, Nowak,
Belzen, & Tiemann, 2015). For example, Koerber et al. (2015) developed a 66-item scale that
assessed five different aspects of scientific thinking to investigate whether scientific reasoning
develops as a unitary skill or as multiple non-related sub-skills. In these studies, cognitive skills
(i.e. analogical reasoning and text comprehension) were measured to demonstrate the discri-
minant validity of measures of cognitive skills from measures of scientific reasoning. Text
comprehension was of interest because many scientific reasoning tests and inquiry-learning
materials require reading and writing. Reported correlations between analogical reasoning
and scientific reasoning ranged from r = .15 (Koerber et al., 2015) to r = .39 (Nehring et al.,
2015) and between reading skills and scientific reasoning from r = .17 (Nehring et al.,
2015) to r = .44 (Mayer et al., 2014).
A second line of research focuses on explaining individual differences in scientific
reasoning as a function of differences in cognitive skills (Osterhaus, Koerber, & Sodian,
2017; van der Graaf et al., 2016). These researchers propose that verbal skills are crucial
for the development of scientific reasoning skills because of the need to encode and
store information in memory via the verbal route (Baddeley, 2000). In contrast to the
research described above, cognitive skills do not serve as control variables but as variables
to test theories about the development of scientific reasoning. For example, van der Graaf
et al. (2016) report correlations between students’ performance on CVS tasks and their
scores on grammar and vocabulary tests. As these studies investigate the impact of mul-
tiple additional variables (e.g. understanding the nature of science, advanced theory of
mind) on individual differences in scientific reasoning, they typically use structural
equation modelling to consider covariance between all assessed variables. They report zero
and even negative factor loadings for the direct path between cognitive skills and scientific
reasoning skills. This pattern shows that even though cognitive abilities have medium-size
correlations with scientific reasoning skills, their effects are mediated rather than direct.
This finding is supported by results of the Munich longitudinal study, which shows posi-
tive relationships between cognitive skills and multiple scientific reasoning measures, but
also that early scientific reasoning skills are a better predictor of later scientific reasoning
than cognitive skills (Bullock, Sodian, & Koerber, 2009).
Research questions
As in many other contexts, ‘knowing that’ (i.e. declarative knowledge) is not the same as
‘knowing how’ (i.e. procedural knowledge). It is important, therefore, to differentiate
between having a conceptual understanding of the control-of-variables strategy (CVS
understanding) and having the ability to apply the control-of-variables strategy in a
hands-on task (CVS hands-on skills) (Marschner et al., 2012). Additionally, a meaningful
relationship between cognitive skills and scientific reasoning skills has been shown
(Koerber et al., 2015; Mayer et al., 2014; Nehring et al., 2015). As a consequence, cognitive
skills can also have a meaningful impact on the construction of knowledge during inquiry.
However, the common impact of cognitive skills, conceptual understanding of CVS, and
CVS hands-on skills on science content knowledge during guided inquiry learning has not
been empirically demonstrated. In the current study, we report a re-analysis of the data
from a study that investigated the differential effects of hands-on versus paper-and-
pencil training on students’ learning of CVS skills (Schwichow, Croker, Zimmerman, &
Härtig, 2016). This current analysis differs from the previous study in that we address
the following research questions:
(1) To what extent does the learning of content knowledge during an inquiry-learning
activity depend on scientific reasoning skills (i.e. CVS understanding, CVS hands-
on skills) and cognitive skills (i.e. analogical reasoning, reading ability)?
(2) Is a conceptual CVS understanding sufficient for successful learning of content knowl-
edge in a guided-inquiry activity, or are CVS hands-on skills necessary as well?
As scientific reasoning skills have been shown to support learning of content knowledge
in guided inquiry-learning activities (Chen & Klahr, 1999; Künsting et al., 2008; McElha-
ney, 2011), we assume, for the first research question, that CVS skills should have a greater
impact on content learning than cognitive skills. For the second research question, we pre-
dicted that CVS hands-on skills would be a more important contributor to content knowl-
edge learning than CVS understanding.
Method
Participants
This study is based on a reanalysis of data from an intervention study investigating the differ-
ential effects of hands-on and paper-and-pencil training (Schwichow, Zimmerman, Croker,
& Härtig, 2016). The original study assessed control-of-variables strategy skills (CVS under-
standing and CVS hands-on skills) at the group level (hands-on vs. paper-and-pencil
training); this reanalysis investigates the impact of cognitive skills and CVS skills on
physics content knowledge gains at the individual level. A total of 189 students (aged 12
to 15 years, 54% female) from eight classes of two comprehensive schools in a suburban
area in northern Germany participated in the original study. As data collection occurred
over two days, we have complete datasets for only 161 students. Both schools were
academically diverse, including students with special educational needs as well as students
approaching a university entrance exam. Both schools enrol equivalent numbers of low-,
medium-, and high-achieving students according to their elementary school reports.
Specific demographic information regarding student ethnicity, socioeconomic status, and
school achievement was not collected due to a lack of permission to collect this information.
students’ experimental designs. After conducting each experiment, students had to choose,
based on their observation of the experimental contrast, whether the hypothesis was sup-
ported or not.
It is important to note that the CVS training phase and the individual experimentation
phase have the characteristics of guided inquiry learning. As students answered the provided research questions by drawing inferences from experiments they created, without step-by-step instructions, these tasks can be characterised as level 2 guided inquiry (Bell et al.,
2005). Students received no further guidance, feedback, or information about the expected
outcome of the experiments.
Measures
CVS understanding
A multiple-choice pretest was administered before the learning activities to test students’
conceptual understanding of the control-of-variables strategy (CVS). All 23 items involved
CVS problems in the context of heat/temperature or electricity/electromagnets, which are
two content domains that are part of the state science curriculum for middle schools in
northern Germany. The items were designed based on the theoretical conceptualisation
of CVS by Chen and Klahr (1999) and cover the three CVS sub-skills: identifying con-
trolled experiments, interpreting the outcome of controlled experiments, and interpreting
the outcome of confounded experiments (i.e. indeterminacy). For items about identifying
controlled experiments, students had to identify the one controlled experiment from a set
of four experiments (including three confounded experiments). For items about interpret-
ation, students were presented with a drawing of either a controlled or a confounded
experiment. For controlled experiments, students had to draw the correct conclusion
based on the presented outcome. For confounded experiments, they had to choose that
no valid conclusion could be drawn from this experiment (see https://goo.gl/whzRR8
for items and Schwichow, Christoph, Boone, and Härtig (2016) for results of a validation
study). We constructed three different test booklets to prevent students from copying
answers from other students. Each booklet consisted of 12 out of 23 possible items: six
heat/temperature items and six electricity/electromagnetism items. Six of each of the
two problem types were included. Any two test booklets overlapped by at least six items. All items presented experiments with three independent variables (two
levels each). Students received one point for every correct answer.
Data analysis
The aim of the reanalysis was to identify the common effect of cognitive skills, CVS under-
standing, and CVS hands-on skills on physics content knowledge gains. Therefore, we
simultaneously analysed the effect of cognitive skills (reading ability and analogical
reasoning) and inquiry skills (CVS understanding and CVS hands-on skills) on student
gains in content knowledge by analysing the data using structural equation modelling
(SEM). We utilised full information maximum likelihood (FIML) to deal with missing
values and robust maximum likelihood estimators to calculate parameters and standard
errors (Kaplan, 2005). The SEM models were calculated with the lavaan-package in R
(Rosseel, 2012).
Scaling
To create scales for reading comprehension, reading speed, and figural analogies we trans-
formed raw values into t-values. Based on the sum of correct responses we assigned to
every student a corresponding t-value according to the grade-specific transformation
tables in the test manuals. However, as we do not have equivalent information about
the population distribution of students’ abilities on other measures (CVS, content knowl-
edge) we used a one-dimensional Rasch model to create scales for these variables (see
Table 2). The Rasch model estimates student abilities and item difficulties based on
student answers by calculating the probability of solving an item for every item and
every student. To create unrestricted scales that are not limited to the 0 to 1 probability range, probability measures are transformed to a logit scale extending from negative infinity to positive
infinity. Lower logit values represent easier items or less able students and higher values
correspond to more challenging items or more able students (Boone, Staver, & Yale,
2014). We calculated CVS understanding and content knowledge person measures as
weighted likelihood estimations (Warm, 1989) in the TAM-package in R (Kiefer,
Robitzsch, & Wu, 2017). We did not utilise Rasch-scaling for the CVS-hands-on items,
as WLE estimates based on four items are highly unreliable. Instead, we calculated a
latent CVS hands-on factor in the SEM model.
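As a rough sketch, the item-response function and logit transformation described above look as follows in code. This is an illustrative implementation of the Rasch model's core formula only, not the weighted likelihood (WLE) estimation routine provided by the TAM package.

```python
import math

def rasch_probability(ability, difficulty):
    """Rasch model: probability that a student with the given ability
    (in logits) solves an item with the given difficulty (in logits)."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def logit(p):
    """Transform a probability in (0, 1) onto the unbounded logit scale."""
    return math.log(p / (1.0 - p))
```

A student whose ability equals an item's difficulty has a 50% chance of solving it; higher logit values correspond to more able students or harder items, consistent with the scale description above.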
Figure 2. Proposed SEM models for comparing the influence of CVS skills, analogical reasoning, and
reading abilities on content knowledge gains.
Note: CVS means control-of-variables strategy. The arrows represent directed effects from one variable to another; oval
shapes represent latent variables, whereas rectangular shapes represent observed variables. Measurement error terms
and model specification error terms are omitted for simplicity.
reasoning and reading skills) are necessary or sufficient for learning content knowledge in
inquiry learning settings we compared three proposed SEM Models (Figure 2). We assume
in the CVS model, that only CVS understanding and CVS hands-on skills influence gains
in physics content knowledge. In the Analogical reasoning and reading model, students’
gains in content knowledge are predicted only by reading and analogical reasoning. In
the Analogical reasoning, reading, and CVS model, we assume that analogical reasoning,
reading skills, CVS understanding, and CVS hands-on skills predict content knowledge
gains. The empirical decision as to which model best fit the data structure was made on
the basis of Root Mean Square Error of Approximation (sufficient fit RMSEA < .06) as
the overall fit index and the Comparative Fit Index (sufficient fit CFI ≥ .95) and the
Tucker-Lewis Index (sufficient fit TLI ≥ .95) as incremental fit indices. Models with sufficient fit indices and/or higher values on fit indices explain the data structure better than models with insufficient or lower values (Schreiber,
Nora, Stage, Barlow, & King, 2006).
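For reference, these fit indices can be computed from the model and baseline chi-square statistics using their standard textbook definitions. The sketch below assumes those conventional formulas and is not the lavaan implementation itself.

```python
def rmsea(chisq, df, n):
    """Root Mean Square Error of Approximation (standard definition)."""
    return (max(chisq - df, 0) / (df * (n - 1))) ** 0.5

def cfi(chisq_m, df_m, chisq_b, df_b):
    """Comparative Fit Index of a model relative to the baseline (null) model."""
    d_m = max(chisq_m - df_m, 0)
    d_b = max(chisq_b - df_b, d_m)
    return 1.0 - d_m / d_b if d_b > 0 else 1.0

def tli(chisq_m, df_m, chisq_b, df_b):
    """Tucker-Lewis Index (non-normed fit index)."""
    return ((chisq_b / df_b) - (chisq_m / df_m)) / ((chisq_b / df_b) - 1)
```

A model whose chi-square equals its degrees of freedom yields RMSEA = 0 and CFI = TLI = 1, i.e. a perfect fit by these criteria.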
Results
The aim of this study was to estimate the effect of cognitive skills (reading ability and ana-
logical reasoning) and scientific reasoning (understanding and hands-on skills of the
control-of-variables strategy (CVS)) on gains in physics content knowledge. To get an
overview of the data structure we first analysed pairwise correlations between all assessed
variables (Table 3). Concerning our first research question the following correlation
pattern is interesting: CVS understanding correlates with CVS hands-on (r = .35, p
< .001) and gain scores in physics knowledge only correlate with measurements of CVS
(CVS understanding: r = .18, p = .02; CVS hands-on: r = .17, p = .03) but not with
reading or analogical reasoning. The pattern of pairwise correlations provides evidence
that students’ learning of content knowledge in our guided inquiry tasks is linked to scien-
tific reasoning skills and not to cognitive skills. However, pairwise correlations do not con-
sider the effects of multiple covariates on the dependent variable and thus further analysis
is necessary. To this end, we compared three structural equation models (Figure 2).
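A pairwise correlation analysis of this kind reduces to computing Pearson's r for each variable pair. The following minimal sketch shows the computation itself; the input values are made up for illustration and are not the study's data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

In practice, each coefficient would also be accompanied by a significance test, as in Table 3.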
An examination of the fit indices (Table 4) yields an insufficient fit of the Analogical
reasoning and reading model to the data. The CVS model fits well according to the CFI but insufficiently with respect to the TLI and the RMSEA. In contrast, the Analogical reasoning, reading, and CVS model shows a good fit to the data according to all indices.
Figure 3 shows the path diagram for the Analogical reasoning, reading, and CVS
model. We represent only the significant paths in Figure 3, although all possible direct
and indirect effects (see Figure 2) of reading abilities, analogical reasoning, CVS under-
standing, and CVS hands-on skills on content knowledge gain were analysed in the model.
Our first research question concerned whether physics content knowledge gains result
from general cognitive skills or scientific reasoning skills. Two results indicate that analogical
reasoning and reading skills are not sufficient for successfully learning content in inquiry-
based learning environments. First, the Analogical reasoning and reading model did not
adequately fit the data structure (Table 4). Second, the results of the analogical reasoning,
reading, and CVS model (Figure 3) show that analogical reasoning and reading ability
have no direct effect on content knowledge gains (indicated by no arrows from analogical
reasoning and reading ability to content knowledge gains in Figure 3). We only found a
medium-sized direct effect of these variables on CVS understanding (Analogical reasoning:
γ = .35, p < .001; Reading ability: γ = .36, p < .001). Furthermore, analogical
reasoning had a weak direct effect on CVS hands-on skills (γ = .19, p = .04).
Figure 3. Analogical reasoning, reading ability, and CVS model. The model is specified according to
model 3 in Figure 2 with all direct and indirect paths on content knowledge gains. Manifest variables
are in rectangles, and latent factors are in ellipses. Arrows that are missing compared to the proposed model
in Figure 2 are non-significant paths (p > 0.05). The residual variance components (measurement error
variance is indicated by small arrows on latent variables) indicate the amount of unexplained variance.
Thus, for every observed variable, R² = 1 − error variance. The path coefficients associated with bold
arrows represent standardised estimates, whereas the coefficients listed next to light-face arrows rep-
resent standardised factor loadings. Levels of confidence are *p < .05; **p < .01; ***p < .001.
Our second research question addressed whether CVS hands-on skills are needed in
addition to CVS conceptual understanding for knowledge gains. CVS hands-on skill
had a direct effect on content knowledge gain (γ = .22, p = .04). We found no direct
effect of conceptual understanding of CVS on content knowledge gain. However, we
did identify a mediation effect of CVS understanding via CVS hands-on skills on
content knowledge gain (CVS understanding → CVS hands-on: γ = .26, p = .008; CVS hands-on → content knowledge gain: γ = .22, p = .04).
To further illustrate the importance of CVS hands-on skills for learning content knowl-
edge we compared students who mastered CVS skills (conducting three or four out of four
controlled experiments, n = 121) to students who ran two or fewer controlled experiments
(n = 69). We rescaled logit values from the Rasch model to the PISA metric with a mean of 500
and SD of 100. The difference between CVS mastery students (M = 510; SD = 101) and
CVS non-mastery students (M = 477; SD = 95) was significant, t(189) = 1.96, p = .05.
Thus students whose facility with CVS enabled them to conduct controlled experiments
showed greater gains in content knowledge relative to students with less well-developed
hands-on CVS skills.
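The rescaling to the PISA metric is a linear transformation of the logit scores. The sketch below assumes standardisation against the sample mean and standard deviation, which may differ in detail from the exact procedure used in the study.

```python
def to_pisa_metric(logits):
    """Linearly rescale logit scores to the PISA metric (mean 500, SD 100),
    standardising against the sample mean and sample standard deviation."""
    n = len(logits)
    mean = sum(logits) / n
    sd = (sum((x - mean) ** 2 for x in logits) / (n - 1)) ** 0.5
    return [500 + 100 * (x - mean) / sd for x in logits]
```

For example, logit scores of −1, 0, and 1 map to 400, 500, and 600 on this metric.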
Discussion
In this study, we investigated which skills students need to achieve content knowl-
edge gains in an inquiry-based learning environment. We utilised structural
equation modelling in order to consider the covariance structure between the
assessed variables. We first discuss the effect of CVS skills versus cognitive skills
(reading skills and analogical reasoning) on students’ content knowledge learning.
Second, we discuss our results regarding the effect of CVS understanding (i.e. con-
ceptual) versus CVS hands-on skills (i.e. procedural) on content knowledge learn-
ing. Third, we address the importance of inquiry activities that offer students
guidance and support. Lastly, we consider the implications of our results for
science education.
Second, introducing students to both the conceptual and procedural aspects of the
control-of-variables strategy and other scientific process skills is a requirement for the
effective learning of content knowledge from guided inquiry. To master such skills stu-
dents need both an introduction to the concepts (e.g. through the cognitive conflict tech-
nique we employed) and training. The results of previous studies show that students can
be trained using paper-and-pencil, computer, or hands-on tasks as long as the training
addresses all relevant sub-skills (Klahr, Triona, & Williams, 2007; Schwichow et al., 2016).
Finally, to construct effective inquiry-learning activities teachers have to align students’
current scientific reasoning skills and the demands of a particular learning activity. For
example, students could be trained on the required skills, and given the opportunity to
practice those skills. Additionally, teachers can adapt the demands of the inquiry activity
to students’ current skill level. The levels of inquiry outlined by Bell et al. (2005) provide
ways to alter teacher guidance and support during inquiry activities, such as providing
research questions or data. Unguided inquiry (level 3), in which students must perform
and coordinate all components of inquiry without scaffolding, requires practice and exper-
tise. Such activities should be the end and not the starting point of learning trajectories for
inquiry learning.
Disclosure statement
No potential conflict of interest was reported by the authors.
ORCID
Anita Stender http://orcid.org/0000-0001-6478-4762
Martin Schwichow http://orcid.org/0000-0001-9694-7183
Hendrik Härtig http://orcid.org/0000-0002-6171-9284
References
Abd-El-Khalick, F., BouJaoude, S., Duschl, R., Lederman, N. G., Mamlok-Naaman, R., Hofstein, A.,
… Tuan, H.-l. (2004). Inquiry in science education: International perspectives. Science
Education, 88(3), 397–419. doi:10.1002/sce.10118
Alfieri, L., Brooks, P. J., Aldrich, N. J., & Tenenbaum, H. R. (2011). Does discovery-based instruc-
tion enhance learning? Journal of Educational Psychology, 103(1), 1–18. https://doi.org/10.1037/
a0021017
Anderson, R. D. (2002). Reforming science teaching: What research says about inquiry. Journal of
Science Teacher Education, 13(1), 1–12. doi:10.1023/A:1015171124982
Baddeley, A. (2000). The episodic buffer: A new component of working memory? Trends in
Cognitive Sciences, 4(11), 417–423.
Baxter, G. P., & Shavelson, R. J. (1994). Science performance assessments: Benchmarks and surro-
gates. International Journal of Educational Research, 21(3), 279–298. doi:10.1016/S0883-0355
(06)80020-0
Bell, R. L., Smetana, L. K., & Binns, I. C. (2005). Simplifying inquiry instruction. The Science
Teacher, 72(7), 30–33.
Boone, W. J., Staver, J. R., & Yale, M. S. (2014). Rasch analysis in the human sciences. Dordrecht:
Springer.
Bullock, M., Sodian, B., & Koerber, S. (2009). Doing experiments and understanding science:
Development of scientific reasoning from childhood to adulthood. In W. Schneider & M.
Bullock (Eds.), Human development from early childhood to early adulthood. Findings from
the Munich longitudinal study (pp. 173–197). Mahwah, NJ: Lawrence Erlbaum.
Carolan, T. F., Hutchins, S. D., Wickens, C. D., & Cumming, J. M. (2014). Costs and benefits of
more learner freedom: Meta-analyses of exploratory and learner control training methods.
Human Factors, 56(5), 999–1014. doi:10.1177/0018720813517710
Chen, Z., & Klahr, D. (1999). All other things being equal: Acquisition and transfer of the control of
variables strategy. Child Development, 70(5), 1098–1120. doi:10.1111/1467-8624.00081
Chiappetta, E. L. (1997). Inquiry-based science: Strategies and techniques for encouraging inquiry
in the classroom. Science Teacher, 64(7), 22–26.
Croker, S., & Buchanan, H. (2011). Scientific reasoning in a real-world context: The effect of prior
belief and outcome on children’s hypothesis-testing strategies. British Journal of Developmental
Psychology, 29, 409–424.
Dewey, J. (1910). Science as subject-matter and as method. Science, 31(787), 121–127. doi:10.1126/
science.31.787.121
Dewey, J. (2002). Logik: Die Theorie der Forschung [Logic: The theory of inquiry] (1st ed.). Frankfurt am Main: Suhrkamp.
Embretson, S. E. (1991). A multidimensional latent trait model for measuring learning and change.
Psychometrika, 56(3), 495–515. doi:10.1007/BF02294487
Fischer, F., Kollar, I., Ufer, S., Sodian, B., Hussmann, H., Pekrun, R., … Eberle, J. (2014). Scientific
reasoning and argumentation: Advancing an interdisciplinary research agenda in education.
Frontline Learning Research, 28–45. doi:10.14786/flr.v2i3.96
Heller, K. A., & Perleth, C. (2000). Kognitiver Fähigkeitstest für 4. bis 12. Klassen [Cognitive abilities test for students from 4th to 12th grade]. Göttingen: Hogrefe.
Kaplan, D. (2005). Structural equation modeling: Foundations and extensions (Reprint). Advanced quantitative techniques in the social sciences: Vol. 10. Thousand Oaks, CA: Sage.
Kiefer, T., Robitzsch, A., & Wu, M. (2017). TAM: Test Analysis Modules: R package version
1.99999-31. Retrieved from https://CRAN.R-project.org/package=TAM
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction
does not work: An analysis of the failure of constructivist, discovery, problem-based, experi-
ential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86. doi:10.1207/
s15326985ep4102_1
Klahr, D. (2000). Exploring science: The cognition and development of discovery processes.
Cambridge, MA: MIT Press.
Klahr, D., Chen, Z., & Toth, E. (2001). From cognition to instruction to cognition: A case study in
elementary school science instruction. In K. D. Crowley, C. D. Schunn, & T. Okada (Eds.),
Designing for science. Implications from everyday, classroom, and professional settings
(pp. 209–250). Mahwah, NJ: Lawrence Erlbaum Associates.
Klahr, D., Triona, L. M., & Williams, C. (2007). Hands on what? The relative effectiveness of phys-
ical versus virtual materials in an engineering design project by middle school children. Journal
of Research in Science Teaching, 44(1), 183–203. doi:10.1002/tea.20152
KMK. (2005). Bildungsstandards im Fach Physik für den Mittleren Schulabschluss: Beschluss vom 16.12.2004 [Educational standards in physics for the intermediate school-leaving certificate: Resolution of 16 December 2004]. München: Wolters Kluwer Deutschland GmbH.
Koerber, S., Mayer, D., Osterhaus, C., Schwippert, K., & Sodian, B. (2015). The development of
scientific thinking in elementary school: A comprehensive inventory. Child Development, 86
(1), 327–336. doi:10.1111/cdev.12298
Koslowski, B. (1996). Theory and evidence: The development of scientific reasoning (1st ed.). Learning, development, and conceptual change. Cambridge, MA: MIT Press.
Künsting, J., Thillmann, H., Wirth, J., Fischer, H. E., & Leutner, D. (2008). Strategisches Experimentieren im naturwissenschaftlichen Unterricht [Strategic experimentation in science instruction]. Psychologie in Erziehung und Unterricht, 55(1), 1–15. Retrieved from http://www.ruhr-uni-bochum.de/lehrlernforschung/website_eng/pdf/kuensting_et_al_2008.pdf
Lawson, A. E., & Wollman, W. T. (1976). Encouraging the transition from concrete to formal cog-
nitive functioning-an experiment. Journal of Research in Science Teaching, 13(5), 413–430.
doi:10.1002/tea.3660130505
Lazonder, A. W., & Harmsen, R. (2016). Meta-Analysis of inquiry-based learning: Effects of gui-
dance. Review of Educational Research, 86(3), 681–718. doi:10.3102/0034654315627366
Linacre, J. M. (2002). What do infit and outfit, mean-square and standardized mean? Rasch Measurement Transactions, 16(2), 878.
Linn, M. C., Clement, C., & Pulos, S. (1983). Is it formal if it’s not physics? The influence of content on formal reasoning. Journal of Research in Science Teaching, 20(8), 755–770. doi:10.1002/tea.3660200806
Marschner, J., Thillmann, H., Wirth, J., & Leutner, D. (2012). Wie lässt sich die Experimentierstrategie-Nutzung fördern? [How can the use of strategies for experimentation be fostered? A comparison of differently designed prompts]. Zeitschrift für Erziehungswissenschaft, 15(1), 77–93. doi:10.1007/s11618-012-0260-5
Mayer, D., Sodian, B., Koerber, S., & Schwippert, K. (2014). Scientific reasoning in elementary
school children: Assessment and relations with cognitive abilities. Learning and Instruction,
29, 43–55. doi:10.1016/j.learninstruc.2013.07.005
McElhaney, K. W. (2011). Making controlled experimentation more informative in inquiry inves-
tigations (Dissertation). University of California, Berkeley, CA.
Minner, D. D., Levy, A. J., & Century, J. (2010). Inquiry-based science instruction: What is it and does it matter? Results from a research synthesis years 1984 to 2002. Journal of Research in Science Teaching, 47(4), 474–496. doi:10.1002/tea.20347
Morris, B. J., Croker, S., Masnick, A., & Zimmerman, C. (2012). The emergence of scientific reasoning. In H. Kloos (Ed.), Current topics in children’s learning and cognition (pp. 61–82). InTech. doi:10.5772/53885
National Research Council. (2010). Exploring the intersection of science education and 21st century
skills: A workshop summary. Washington, DC: National Academies Press.
National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: The National Academies Press.
Nehring, A., Nowak, K. H., Upmeier zu Belzen, A., & Tiemann, R. (2015). Predicting students’ skills in the context of scientific inquiry with cognitive, motivational, and sociodemographic variables. International Journal of Science Education, 37(9), 1343–1363. doi:10.1080/09500693.2015.1035358
NGSS Lead States. (2013). Next generation science standards: For states, by states. Washington, DC:
The National Academies Press.
Osborne, J. (2013). The 21st century challenge for science education: Assessing scientific reasoning.
Thinking Skills and Creativity, 10, 265–279. doi:10.1016/j.tsc.2013.07.006
Osterhaus, C., Koerber, S., & Sodian, B. (2017). Scientific thinking in elementary school: Children’s
social cognition and their epistemological understanding promote experimentation skills.
Developmental Psychology, 53(3), 450–462. doi:10.1037/dev0000260
Popper, K. R. (1966). Logik der Forschung [The logic of scientific discovery]. Tübingen: J.C.B. Mohr.
Ross, J. A. (1988). Controlling variables: A meta-analysis of training studies. Review of Educational
Research, 58(4), 405–437. doi:10.3102/00346543058004405
Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1–36.
Ruiz-Primo, M. A., & Shavelson, R. J. (1996). Rhetoric and reality in science performance assess-
ments: An update. Journal of Research in Science Teaching, 33(10), 1045–1063.
Schneider, W., Schlagmüller, M., & Ennemoser, M. (2007). LGVT 6-12: Lesegeschwindigkeits- und -verständnistest für die Klassen 6-12 [Reading speed and reading comprehension test for grades 6 to 12]. Göttingen: Hogrefe.
Schreiber, J. B., Nora, A., Stage, F. K., Barlow, E. A., & King, J. (2006). Reporting structural equation
modeling and confirmatory factor analysis results: A review. The Journal of Educational
Research, 99(6), 323–338. doi:10.3200/JOER.99.6.323-338
Schwichow, M., Croker, S., Zimmerman, C., Höffler, T., & Härtig, H. (2016). Teaching the control-of-variables strategy: A meta-analysis. Developmental Review, 39, 37–63. doi:10.1016/j.dr.2015.12.001
Schwichow, M., Zimmerman, C., Croker, S., & Härtig, H. (2016). What students learn from hands-on activities. Journal of Research in Science Teaching, 53(7), 980–1002. doi:10.1002/tea.21320
Solomon, J. (1994). The rise and fall of constructivism. Studies in Science Education, 23(1), 1–19.
doi:10.1080/03057269408560027
van der Graaf, J., Segers, E., & Verhoeven, L. (2016). Scientific reasoning in kindergarten: Cognitive
factors in experimentation and evidence evaluation. Learning and Individual Differences, 49,
190–200. doi:10.1016/j.lindif.2016.06.006
Warm, T. A. (1989). Weighted likelihood estimation of ability in item response theory.
Psychometrika, 54(3), 427–450. doi:10.1007/BF02294627
Zimmerman, C. (2007). The development of scientific thinking skills in elementary and middle
school. Developmental Review, 27(2), 172–223. Retrieved from http://www.cogsci.ucsd.edu/~
deak/classes/EDS115/ZimmermanSciThinkDR07.pdf
Zimmerman, C., & Croker, S. (2013). Learning science through inquiry. In G. J. Feist, & M. E.
Gorman (Eds.), Handbook of the psychology of science (pp. 49–70). New York: Springer
Publishing Company.