RESEARCH ARTICLE
Juanjuan Chen, Minhong Wang, Tina A. Grotzer, Chris Dede
1 KM&EL Lab, Faculty of Education, The University of Hong Kong, Hong Kong, China
2 Graduate School of Education, Harvard University, USA

Correspondence
Minhong Wang, KM&EL Lab, Faculty of Education, The University of Hong Kong, Pokfulam Road, Hong Kong, China.
Email: magwang@hku.hk

Funding information
General Research Fund (No. 17201415) from the Research Grants Council (RGC) of Hong Kong; U.S. National Science Foundation grant DRL 1416781

Abstract
The use of external representations has the potential to facilitate inquiry learning, especially in hypothesis generation and scientific reasoning, which are typical difficulties encountered by students. This study proposes and investigates the effects of a three-dimensional thinking graph (3DTG) that allows learners to combine in a single image problem information, subject knowledge (key concepts and their relationships), and the hypothesizing and reasoning process involved in exploring a problem, to support inquiry learning. Two classes of eleventh-grade students (97 in total) were randomly assigned to use either the 3DTG (experimental condition) or a concept map (control condition) to complete a group-based inquiry task in an online environment. Data were collected from multiple sources, including measures of group inquiry task performance, post-knowledge-test scores, post-questionnaires on student perceptions of consensus building, and an open-ended survey. The analysis of group task performance revealed that participants in the experimental condition performed better in the inquiry task than their counterparts, specifically in generating hypotheses and reasoning with data, but not in drawing conclusions. The findings show the 3DTG's benefits in facilitating the exploration and justification of hypotheses, and also suggest that the 3DTG can share equal value with concept mapping in terms of drawing conclusions and consensus building. In addition, students using the 3DTG achieved higher scores on the post-knowledge test, although no difference was found in their perceptions of consensus building. These findings were validated by students' responses to the survey. Implications of this study and future work are also discussed.
KEYWORDS
collaborative learning, concept map, external representations, inquiry learning, science education
J Res Sci Teach. 2018;55:1239–1263. wileyonlinelibrary.com/journal/tea © 2018 Wiley Periodicals, Inc.
1 | INTRODUCTION
Inquiry learning with real-world problems involves complex processes of collecting a variety of
information and data, integrating these data with various aspects of subject knowledge, and hypothesiz-
ing and reasoning with intertwined variables. Accordingly, there is a need for more studies investigat-
ing approaches to externalizing the complex, implicit aspects of inquiry tasks to facilitate deeper and
more meaningful learning in authentic situations (Wang, Kirschner, & Bridges, 2016). This study proposes and investigates the effects of a three-dimensional thinking graph (3DTG), which allows learners to combine in a single image problem information, subject knowledge (key concepts and their relationships), and the hypothesizing and reasoning process involved in exploring a problem, to support inquiry learning. The premise underlying the design is that externalizing the complex aspects of the inquiry task makes it relatively easier for learners to successfully solve the problem (Janssen, Erkens, Kirschner, & Kanselaar, 2010). Representational guidance on reasoning also makes the formulation of hypotheses and logical reasoning salient, focusing students' attention on these activities (Toth, Suthers, & Lesgold, 2002). This study therefore aims to explore the effects of the proposed 3DTG on inquiry learning in an online environment.
2 | THEORETICAL FRAMEWORK
to affect learners’ abilities to develop scientific understandings and conduct scientific investigations
(Lawson, Banks, & Logvin, 2007).
Students have difficulty reasoning about systems; in particular, they reveal an inability to reason about causality in a systemic sense (Grotzer & Bell Basca, 2003). They tend to reason at a local, rather than a global, level and miss the system-level interactions and the larger picture (Penner, 2000). Systemic explaining and reasoning characterizes scientific sense-making; it involves comprehending a variety of causal patterns, such as domino-like, cyclic, or mutual patterns. Without a grasp of the underlying causal relationships, students can easily develop misconceptions and a superficial understanding of systems. As pointed out by Grotzer and Bell Basca (2003), local causal reasoning, as opposed to domino-like reasoning, is common in students' thinking about ecosystem relationships, making the root cause of a problem difficult to discern.
solutions better (i.e., reasoning) and demonstrated more cognitive and meta-cognitive activities
reflected in online discussions than those who did not create causal maps. In addition, evidence maps
have been developed to make the relationship between evidence (i.e., information or data) and a
hypothesis more explicit, and are conceptually oriented rather than problem-solving oriented (Suthers
et al., 2008; Toth et al., 2002). Suthers et al. reported that students using evidence maps generated
more hypotheses and were more likely to converge on the same conclusion. Conversely, the evidence
mapping tool employed by Toth et al. (2002) had a significantly positive effect on reasoning, but no
significant effect on hypothesis generation. Finally, Wang, Wu, Kinshuk, Chen, and Spector (2013)
proposed a computer-based integrated cognitive map that enabled students to externalize the reasoning
process on a reasoning map and the knowledge underlying the reasoning process on a concept map
when they worked with diagnostic problems. The approach showed promising effects on diagnostic
problem solving.
In summary, a concept map is mainly used to visually represent the relationships between concepts in domain knowledge (Ruiz-Primo & Shavelson, 1996). However, concept mapping alone has been found to be inadequate for supporting complex problem-solving tasks, particularly for eliciting and representing the process of applying knowledge to practice (Wang, Cheng, Chen, Mercer, & Kirschner, 2017). A causal map is a kind of concept map that can be used to represent causal relationships, a specific type of relationship between concepts (Slof et al., 2012). An evidence map goes beyond conceptual knowledge by representing the reasoning process and building the relations between evidence and hypothesis (Suthers et al., 2008).
Given that inquiry and problem-solving tasks usually mix many kinds of information and data, concepts, and relationships, learning in such contexts often involves complex cognitive processes such as searching for information and data from multiple aspects, integrating information with domain knowledge, and generating a solution or explanation to the problem by reasoning with data that are intertwined in sophisticated ways. Many students find this hard and cognitively demanding. Therefore, there is a need for more studies investigating how the complex cognitive processes involved in inquiry and problem-solving tasks can be externalized and facilitated toward desired task performance and learning outcomes. Here, we studied the use of the proposed 3DTG, which allows learners to combine in a single image problem information, subject knowledge (key concepts and their relationships), and the hypothesizing and reasoning process, to support inquiry learning. Externalizing the three cognitive aspects (i.e., searching for information and data, recalling and applying domain knowledge, and hypothesizing and reasoning) in a holistic way is critical for successful inquiry learning.
More specifically, when students construct concept maps, relationships between intertwined concepts/variables become clearer to them. Concept maps enable students to bring out important concepts/variables that they probably would not have noticed otherwise (Markow & Lonning, 1998); the learners will think about how different variables are related to each other. Second, compiling the problem information and the changes in a set of key variables can make important aspects of the problem salient; a table is an appropriate external representation for a series of information and data. Third, inspired by the design of the evidence map, we designed a reasoning map to incorporate the processes of hypothesis generation and reasoning. The difference between the evidence map used by Suthers et al. (2008) and our reasoning map is that the evidence map incorporates only one hypothesis, whereas our reasoning map represents a logical series of hypotheses to find the root cause, instead of a superficial cause, of a problem. Finally, integrating the three types of representations in a single image was inspired by Wang et al.'s (2013) study on connecting problem solving (i.e., a reasoning map) and knowledge construction (i.e., a concept map) in a single image, which has proved promising in problem-solving contexts.
FIGURE 1 Three-dimensional thinking graph (3DTG). [Color figure can be viewed at wileyonlinelibrary.com]
skills (Hmelo-Silver et al., 2007). In addition, its effect on students’ perceptions of consensus building
is explored because consensus building is an important but rarely examined inquiry activity. Research
has indicated that consensus building can be improved by students' construction of external representations (Gijlers & de Jong, 2013). Therefore, this study's hypotheses are stated as follows:
Hypothesis 1 Students using the 3DTG will perform better in the inquiry task in terms of formulating hypotheses, reasoning, and making conclusions than those using only the concept map.
Hypothesis 2 Students using the 3DTG will achieve higher scores on a knowledge test
than those using only the concept map.
Hypothesis 3 Students using the 3DTG will have more positive perceptions of consensus
building than those using only the concept map.
3 | METHODOLOGY
3.2 | Participants
Two 11th grade classes, from one high school, taught by the same biology teacher, were recruited and
randomly assigned to the experimental condition or the control condition. One class with 48 students
(24 males and 24 females) was assigned to the experimental condition using the proposed 3DTG, while
the other class with 49 students (24 males and 25 females) was assigned to the control condition using
a traditional concept mapping approach. In each condition, the students were divided into small groups
of three or four each, based on their friendships. In total, there were 15 experimental groups and 16
control groups. The average age for both classes was 17 (ranging from 16 to 18). Before the experi-
ment, the participants had already acquired basic knowledge about ecosystems.
FIGURE 2 Information collection and data observation. [Color figure can be viewed at wileyonlinelibrary.com]
organisms over a number of virtual “days,” and collect relevant data to investigate why many of the
fish had died off. The module contained information collection and data observation submodules, as
shown in Figure 2. The information collection submodule provides a rich context, such as vivid descriptions and visualizations of the surroundings and background information on the pond ecosystem necessary for problem solving. By selecting a specific day from a dropdown list of dates, the user could observe and collect information for that day; this provided tacit clues or hints and guided students in their observation of the phenomena. Students were able to relate the observations to their prior knowledge.
The submodule of data observation facilitates problem exploration by depicting data graphs of the
key variables and the changes in variables over a period. For example, in the pond ecosystem, students
could observe data on water conditions (e.g., water temperature, turbidity, concentration of PO4 and
NO3, dissolved oxygen), weather conditions (e.g., air temperature, wind speed, cloud cover), and the
population of various organisms (e.g., bacteria, algae, bass). Moreover, the module enables the learner
to construct a query by selecting variables to facilitate the analysis and comparison of variables.
The learning support module afforded some learning guidelines. First, it included domain-specific
knowledge about ecosystems and ecological processes, and a list of key concepts regarding the prob-
lem context, such as the food web, the roles of biotic and abiotic factors, and the processes involved
(e.g., photosynthesis). Previous research has shown the importance of prior knowledge in inquiry activ-
ities (Shin, Jonassen, & McGee, 2003) and indicated the necessity of having sufficient contextual con-
cepts for productive problem solving (Gijbels, Dochy, Van den Bossche, & Segers, 2005). Without
sufficient prior knowledge, students might not have made good interpretations of the data.
Second, this module incorporated the introduction of basic skills and the steps required for scien-
tific inquiry, such as hypothesis formation and evidence-based reasoning. Instructional guidance on
structuring the inquiry process has proved effective (Kirschner et al., 2006; Linn & Songer, 1991;
Njoo & de Jong, 1993b; Quintana et al., 2004; White, 1993).
Third, the module provided general instructions on how to build concept maps, and how to con-
duct group discussions. Researchers have pointed out the necessity of providing learners with
instructions and explicit guidance in the use of tools and strategies (Stoyanov & Kommers, 2006).
Additionally, the environment for the experimental group offered guidelines for using the 3DTG.
Lastly, an example was provided to both groups to demonstrate how to use the learning system. Coaching also helped learners acquire a better understanding of the processes and principles of inquiry learning.
3.5 | Procedure
The experiment lasted for six sessions over two weeks (45 min per session). During the first session, consent forms were signed by the participants. A pretest questionnaire was administered to collect information on the students' age, gender, and computer skills. Groups of three or four were then formed by the students on the basis of their friendships.
At the beginning of the second session, the researcher provided a 20-min training to both the
experimental and control groups on how to perform inquiry learning in the online system. The train-
ing was conducted in the school’s computer lab, where the researcher explicitly introduced the prin-
ciples and procedures of inquiry learning in the form of an example. The principles of social
interaction, together with the supporting materials were also briefly introduced during the training
session. Additionally, students in the experimental condition were introduced to the approach to con-
structing a 3DTG, and students in the control condition were introduced to the approach to creating
a concept map.
After the training, students began to collaboratively perform the learning task in the school’s com-
puter lab, with group members sitting together and each one accessing a networked computer. To sup-
port or reject their initial hypotheses, they made observations, collected information, and formulated
hypotheses from multiple perspectives by brainstorming ideas based on the compiled evidence (i.e.,
information and data). During the group discussion and collaboration, each group member was able to
share his/her findings within the group and could be challenged by group peers. Finally, the group had
to reach an agreement on the best explanations for the problem. At the end of the fifth session, each
group was asked to submit an inquiry report synthesizing the group’s inquiry process (including the
collected information and data, generated hypotheses, reasoning, and conclusions). This task lasted for three and a half sessions.
After the fifth session, a paper-based, two-item open-ended survey was administered to ascertain all students' perceptions of the inquiry activities; students wrote their responses to the two open-ended questions in about 15 min. Finally, in the sixth session, the participants were asked to individually take a knowledge test (30 min) and a Likert-scale questionnaire (15 min). The open-ended survey, the test, and the Likert-scale questionnaire took place in the participants' classroom.
choice items of the pretest and post-test, respectively, showing that the two tests have high reliability.
The inter-rater reliability for nonmultiple-choice items was 0.81. Thus, the reliability of the two tests
can be considered acceptable.
TABLE 1 (Continued)

1. Identification of critical information and data
   Criterion: inclusion of critical information and data.
   Measure: the ratio of collected information and data to the total relevant information and data posted.
   Example: "It rained heavily on July 6th."

2. Formulation of hypothesis
   2.1 Relevance or importance. Is the formulated hypothesis important to the cause analysis of the pollution problem?
   Scale: 1 = irrelevant or unimportant; 2 = a little relevant; 3 = relevant; 4 = very relevant or important; 5 = highly or particularly relevant/important.
   Examples: "The fish die-off might be caused by the deficiency of oxygen." (E1; a highly relevant/important hypothesis.) "The higher cloud cover led to the increase of NO3." (E2; an irrelevant hypothesis.)

   2.2 Validity. Is the formulated hypothesis valid?
   Scale: 1 = totally invalid; 2 = partially valid; 3 = seemingly valid to some extent; 4 = valid; 5 = absolutely valid.
   Example: "The increase of PO4 and NO3 led to the increase of pH value." (E3; a totally invalid hypothesis.)

4. Reasoning and justification
   4.1 Relevance or importance. Is the reasoning and justification relevant to the analysis of the pollution problem's cause? How well does the report provide appropriate evidence, proof, or examples to support the hypothesis?
   Scale: 1 = irrelevant; 2 = a little relevant; 3 = relevant; 4 = very relevant; 5 = highly/particularly relevant.
   Example: "The oxygen level reduced to an extremely low level (i.e., 4.1 mg/L), which caused the deficiency of oxygen needed by the fish." (A piece of highly relevant evidence supporting E1.)

   4.2 Correctness, reasonableness. Is the reasoning and justification correct or reasonable?
   Scale: 1 = little argument and evidence; 2 = argument with irrelevant evidence; 3 = correct argument with little evidence; 4 = correct argument supported by more evidence; 5 = correct argument with sufficient correct evidence.
   Example: the piece of evidence given in 4.1 supporting E1 is correct.
appropriate for factor analysis (i.e., if the KMO value is less than 0.5 or the Bartlett's test significance value is greater than 0.05, factor analysis should not be applied). In the varimax-rotated solution, one factor was extracted with an eigenvalue greater than 1, and it explained 71.035% of the total variance. All five items achieved factor loadings above 0.5 (with the minimum being 0.662); thus, no items were removed. Regarding reliability, the Cronbach's alpha value for the consensus building construct was 0.944, indicating credible internal consistency.
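For readers unfamiliar with the statistic, Cronbach's alpha can be computed from raw item scores as alpha = k/(k-1) * (1 - sum of item variances / variance of respondents' total scores). The sketch below is a minimal illustration only; the item-score lists are hypothetical, not the study's questionnaire data:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a questionnaire scale.

    item_scores: list of k lists, one per item, each holding the
    scores of the same n respondents (in the same order).
    """
    k = len(item_scores)
    n = len(item_scores[0])

    def variance(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Sum of per-item variances, and variance of each respondent's total
    item_var = sum(variance(col) for col in item_scores)
    totals = [sum(col[i] for col in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Two perfectly correlated items -> alpha = 1.0
print(round(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]), 3))
```

Values near the study's reported 0.944 would similarly indicate that the five consensus-building items vary together across respondents.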
(1) Factor analysis was conducted to ensure the construct validity of the consensus building questionnaire items, and Cronbach's α was run to assess the internal consistency of the questionnaire.
(2) Cohen's kappa was run to determine the inter-rater reliability (agreement) in scoring the inquiry reports. A result exceeding 0.7 indicates substantial agreement; 0.5 indicates moderate agreement.
(3) One-way analysis of variance (ANOVA) was performed to evaluate statistical differences between the two conditions on the pretest questionnaire, prior knowledge, and post-test questionnaire. Meanwhile, ANCOVA was used to explore the effects of the intervention on post-test scores.
(4) Multivariate analysis of variance (MANOVA) was conducted to examine the statistical difference
between the two conditions in the aspects of the group task performance.
(5) A thematic content analysis of the written survey was performed to find common themes among
students’ responses to each open-ended question.
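As a minimal illustration of analysis step (2), Cohen's kappa corrects the two raters' observed agreement for the agreement expected by chance. The rating vectors below are hypothetical examples, not the study's scoring data:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    # Observed agreement: proportion of items rated identically
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the raters assigned categories independently
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Agreement on 3 of 4 items, two categories -> kappa = 0.5
print(cohen_kappa([1, 1, 2, 2], [1, 2, 2, 2]))
```

By the thresholds quoted above, a kappa of 0.5 would count as moderate agreement and values above 0.7 as substantial agreement.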
4 | RESULTS
TABLE 3 MANOVA results for the group inquiry report and Levene's test for equality of variances
*p < 0.05.
Hypothesis 1: the die-off of fish caused the increase of bacteria; Hypothesis 1.1: the death of green algae caused the increase in the concentration of PO4; Hypothesis 1.1.1: the construction of the nearby community had an effect on the fish die-off. There is no causal relationship between Hypothesis 1 and Hypothesis 1.1, or between Hypothesis 1.1 and Hypothesis 1.1.1. A logical set of hypotheses would give a right and clear direction for reasoning and for finding the root cause of the problem, which is important for effective thinking and learning in inquiry contexts.
Thus, corresponding to Hypothesis 1, these results confirm the benefits of using the 3DTG to facilitate students' inquiry task performance, more specifically in formulating hypotheses, in the sufficiency of those hypotheses, and in reasoning with the data, but not in drawing conclusions. This shows that the 3DTG fostered students' skills in exploring and justifying their hypotheses.
With the pretest score as the covariate, ANCOVA was then run to adjust for the pretest score. The result revealed a statistically significant difference between the experimental and control conditions, F(1, 93) = 9.036, p = 0.003: students in the experimental condition achieved higher scores on the post-knowledge test. Moreover, Cohen's effect size index d (Cohen, 1992) was computed to illustrate the extent of the practical difference between groups. Cohen's d values of 0.2, 0.5, and 0.8 are interpreted as small, medium, and large effect sizes, respectively. The effect size reported in this study (d = 0.54) showed a medium effect of the intervention. This demonstrates the effectiveness of the 3DTG in improving students' knowledge. Hence, Hypothesis 2 is accepted.
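For reference, Cohen's d for two independent groups is the difference between the group means divided by the pooled standard deviation. The sketch below uses made-up group scores, not the study's test data:

```python
import math

def cohens_d(group1, group2):
    """Cohen's d for two independent samples (pooled SD, ddof = 1)."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    # Pooled standard deviation across the two samples
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Means differ by exactly one pooled SD -> d = 1.0
print(round(cohens_d([3, 4, 5], [2, 3, 4]), 3))
```

On this scale, the d = 0.54 reported above means the experimental group's mean sat roughly half a pooled standard deviation above the control group's.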
4.4 | Postquestionnaire
Levene's test verified the equality of variances in the two conditions (Levene's statistic = 0.001, p = 0.980). The means and standard deviations are as follows: M = 4.27 (SD = 0.633) for the experimental condition, and M = 4.40 (SD = 0.548) for the control condition. The ANOVA results indicated no significant difference in students' perceptions of the group consensus building activities between the two conditions, F(1, 79) = 0.953, p = 0.332. Thus, Hypothesis 3 is rejected.
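For two conditions, the one-way ANOVA F statistic reported above is the ratio of the between-group to the within-group mean squares. A minimal sketch from raw scores (the sample lists are hypothetical, not the questionnaire data):

```python
def one_way_f(groups):
    """One-way ANOVA F statistic for a list of sample groups."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    # Between-group sum of squares and degrees of freedom
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    df_between = len(groups) - 1
    # Within-group sum of squares and degrees of freedom
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    df_within = len(all_scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hand-checkable example: SSB = 1.5 (df 1), SSW = 4 (df 4) -> F = 1.5
print(one_way_f([[1, 2, 3], [2, 3, 4]]))
```

An F near 1 with a large p value, as in the result above, indicates that the between-condition variation is no larger than would be expected from within-condition noise.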
TABLE 6 Comparison of experimental and control group responses to the difficulties in solving the problem

Frequencies are shown as K (%): experimental group N = 68, control group N = 67.

3. Difficulty in investigating the root cause ("Could only find the surface cause, but could not find the root cause."): experimental 2 (3%); control 0.
4. Difficulty in processing a lot of intertwined and complex data ("There was so much information and data that it was hard to process and synthesize them."): experimental 14 (20%); control 9 (13%).
5. Difficulty in reasoning with data and justifying hypotheses ("The compiled evidence could not prove the hypothesis."): experimental 14 (20%); control 26 (39%).
7. Challenge in group work ("It was hard for the diverse ideas of group members to converge."): experimental 3 (4%); control 6 (9%).
9. Others
   9.1 Technical issue ("My computer did not work."): experimental 3 (4%); control 1 (1%).
   9.2 Time issue ("Need more time to complete the task."): experimental 2 (3%); control 0.
   9.3 Knowledge issue ("We had insufficient theoretical knowledge."): experimental 1 (1%); control 3 (4%).

Note. N = total number of responses (one participant might write more than one response, and one response might convey more than one theme); K = number of responses to each theme/category; % = percentage of responses.
As seen in Table 7, regarding student responses to the second question, on the benefits of the 3DTG (or the concept map for the control condition), the most salient benefit was that the map made their thinking and reasoning visible. In addition, the two groups shared similar views that using the maps enabled them to hypothesize in a logical way and perform progressive reasoning.

However, some benefits were reported only by the experimental group and were absent from the control group's responses, including visualizing the hypothesizing process in a global view, promoting reasoning globally, fostering reflection during the reasoning process, and facilitating conclusion making and communication. Another difference was that 21% of the responses from the control group pointed out that concept mapping was not beneficial to their inquiry learning. In particular, one student in the control group mentioned that "Although concept map facilitated the hypothesizing, it could not represent detailed information and evidence."

TABLE 7 Comparison of experimental and control group responses to the benefits of the mapping or thinking tool

Frequencies are shown as K (%): experimental group N = 47, control group N = 33.

1. Data collection
   1.1 Visualizing the data ("Making the data visible."): experimental 1 (2%); control 0.
   1.2 Viewing the data in a global manner ("Facilitating the integration of the information."): experimental 1 (2%); control 0.
2. Hypothesis formulation
   2.1 Making multiple hypotheses and their relations visible ("Recording our hypotheses; visualizing the relations between the hypotheses."): experimental 4 (9%); control 0.
   2.2 Analyzing multiple hypotheses in a big picture ("Facilitating our analysis of the fish die-off problem from multiple dimensions."): experimental 3 (6%); control 0.
   2.3 Hypothesizing in a logical way ("Helping us clarify our thinking and find the right direction to explain the problem."): experimental 1 (2%); control 1 (3%).
5 | DISCUSSION
The data analysis results for the overall inquiry task performance show that the groups in the experimental condition performed better than their counterparts in the control condition. This was consistent
with previous studies (Janssen et al., 2010; Slof, Erkens, Kirschner, & Helms-Lorenz, 2013; Suthers
et al., 2008), which also proved the effectiveness of representational support for group performance in
solving complex problems. The 3DTG provided learners with an overview of the entire inquiry task so
that they could easily access the task from different perspectives, that is, the problem’s information and
data, subject knowledge underlying the problem, and the logic behind the reasoning. This made it rela-
tively easier for learners to successfully solve the problem (Janssen et al., 2010).
More specifically, student groups using the 3DTG performed significantly better in almost
all aspects of the inquiry process except for making conclusions. First, in formulating hypothe-
ses, the experimental groups significantly outperformed their counterparts. One plausible expla-
nation might be that the 3DTG enabled the learners to develop a logical set of hypotheses on
the basis of the hypothetical reasoning processes articulated in the reasoning map, and the rele-
vant data and knowledge represented in data table and concept map, respectively. Students’
responses to the open-ended survey supported this explanation. While both groups reported dif-
ficulties in hypothesizing, the experimental group reported that they benefitted more from the
3DTG in visualizing multiple hypotheses and their relations in a big picture. In addition, the dif-
ficulties reported by students in the experimental group involved more aspects of hypothesizing
including exploring multiple hypotheses and the root cause of the problem, showing their
engagement in high-order thinking when exploring hypotheses. Such engagement enabled them
to generate sufficient reasonable hypotheses in their inquiry report. Moreover, the 3DTG
enabled students to review and revise previously formed hypotheses. The presence of rejected
hypotheses meant that students could consider information or data that conflicted with their previously formed ideas. Conversely, students in the control condition expressed confusion over when and how to reject a hypothesis. The finding confirmed the belief that well-designed
external representations assist problem solving by re-ordering and refining the ideas in ways
that are helpful to find the solution (Cox, 1999).
Second, the 3DTG significantly improved the experimental group’s reasoning performance. This
could be caused by the reasoning map in the 3TDG, which focused the students’ attention on the rea-
soning activities (e.g., developing logical reasoning by finding confirming and disconfirming evidence,
thinking in an organized and logical way), thereby making the reasoning salient and easy to observe
(Toth et al., 2002). These views were expressed in the experimental group’s responses to the 3DTG’s
benefits. Research also has indicated that visual representations help learners explicitly clarify their
thinking (Brna, Cox, & Good, 2001).
In particular, the 3DTG helped learners to clearly see the relationships between different pieces of
information. Some students in the experimental condition said that drawing the data table helped them notice easily ignored data. Looking at the data table enabled the learners to summarize all possible
pieces of evidence for or against a hypothesis without missing anything. This confirms the view that
proper external representations help make information explicit (Cox, 1999). Studies have also indicated
that the diagrams/graphs are quite useful for relating information (Suthers & Hundhausen, 2002). Fur-
ther, based on the explicit information, the learners could easily reflect on the reasoning process, which
was only mentioned in the experimental group’s responses.
More importantly, requiring students to list the confirming and disconfirming evidence might have
stimulated them to search for counterevidence (Janssen et al., 2010). Students have reported that finding counterarguments is difficult in scientific reasoning (Kuhn et al., 2000). In this study, the control
group students said concept mapping could not embody detailed information and evidence. In fact,
some of their submitted reports failed to provide enough data or adequately explain the cause-and-
effect relationship. The survey data also indicated that more students in the control group reported diffi-
culty in reasoning and justifying hypotheses.
Although concept maps have the capability to make thinking visible, concept mapping alone is
inadequate in eliciting and representing the process of applying knowledge to solve a problem (Wang
et al., 2017). The 3DTG has greater potential to visualize and facilitate learners’ hypothetical reasoning
and argumentation by integrating the concept map, data table, and reasoning map in a holistic way.
The responses to the open-ended survey also showed that 3DTG promoted reasoning globally by mak-
ing visible the required information, underlying concepts, and the reasoning process. All of these might
better explain why the experimental groups received higher scores for reasoning and justification.
Third, with regard to the effects on drawing conclusions, there was no significant difference between
the two conditions. Research has also suggested the effectiveness of concept mapping in facilitating the
convergence of conclusions among group members (Gijlers & de Jong, 2013; Novak, 2002; Suthers
et al., 2008). This indicated that the 3DTG and the concept map were equally beneficial to drawing conclusions. Although the experimental groups did not perform significantly better in drawing conclusions, they received significantly higher scores for generating hypotheses and reasoning. This suggested that students
using the 3DTG developed conclusions on the basis of more systematic thinking and reasoning and suffi-
cient evidence, which is critical to inquiry learning. Hypothesizing and reasoning are fundamental inquiry
skills (Toth et al., 2002). Drawing conclusions grounded in systematic thinking and reasoning has greater
potential than jumping to correct conclusions with inadequate reasoning. Once students learn to formulate
hypotheses and reason, their competence may transfer to other similar scientific activities.
Fourth, with respect to the students’ perceptions of consensus building, the result indicated quite
positive perceptions, despite the nonsignificant difference between the two conditions. This was con-
sistent with the findings of Gijlers et al. (2009), which also indicated that encouraging learners to col-
laboratively build an external representation facilitated consensus building activities. The finding
suggested that the 3DTG can share equal value in some aspects with concept mapping.
Finally, the better task performance of students in the experimental condition might explain their significantly higher scores in the postknowledge test compared with their counterparts. As students using the
3DTG deeply reflected on the relationships between the domain concepts and events, they had better
understanding of the causal relationships in the ecosystems. Research has also suggested that reasoning
and writing activities help students integrate new information with prior knowledge to develop deep,
contextualized, and applicable knowledge (Keselman, Kaufman, Kramer, & Patel, 2007; Keys, 2000).
Moreover, reasoning ability influences achievement (Lawson et al., 2007); therefore, the higher reasoning
ability shown by the experimental group students may also explain their better achievement in the
post-test. In addition, according to the survey results, slightly more students in the control group
reported issues with knowledge.
6 | CONCLUSION
In science learning, students often have difficulty engaging in fruitful inquiry. In addition to the difficulty of regulating the inquiry process, they encounter problems in generating hypotheses and in carrying out scientific reasoning with intertwined variables. Despite the availability of different kinds of support or guidance (e.g., structuring tasks, using prompts and hints), learners may still find it cognitively demanding to complete inquiry and problem-solving tasks, which typically combine many kinds of information and data, concepts, and relationships, and involve complex hypothesizing and reasoning with those data and knowledge. This study proposed and investigated the effects of the 3DTG, which allows learners to articulate the problem information, the relevant concepts and their relationships, and the hypothesizing and reasoning processes of exploring the problem in a holistic picture to support inquiry learning.
The findings show that the students using the 3DTG achieved significantly better knowledge and
performed significantly better on the group inquiry task, suggesting the benefits of constructing the
3DTG to support inquiry learning and complex problem solving. Specifically, the 3DTG provided
learners with an overview of the inquiry task, and guided them in generating hypotheses step-by-step,
developing evidence-based reasoning based on relevant data and knowledge, and knowing how and
where to revise previously formed but unreasonable hypotheses or ideas. By incorporating a problem’s
data, subject knowledge, and the hypothesizing and reasoning process in a holistic visual representa-
tion, the proposed approach demonstrated its promising effects in supporting inquiry learning espe-
cially in facilitating the formulation of hypotheses and the reasoning process with complex problems.
At the same time, the 3DTG can share equal value with concept mapping in terms of drawing conclusions and consensus building.
6.1 | Implications
This study has several implications for designing support or guidance for inquiry learning in computer-
supported environments. First, it is important to scaffold students when they engage in a complex
inquiry task. Visually representing in a holistic picture the concepts of subject knowledge, problem
information and data, and the progressive process of evidence-based reasoning and hypothesizing
based on relevant data and knowledge can facilitate students’ inquiry performance. Externalizing the
key cognitive aspects (e.g., searching for information and data, constructing domain knowledge, and
hypothesizing and reasoning) in a holistic way is critical for successful inquiry learning. Second, the
choice of representational tools should align with the characteristics and demands of the problem or
task (Cox, 1999; Slof, Erkens, Kirschner, & Jaspers, 2010). The 3DTG used in this study has shown
promising effects in externalizing and facilitating complex hypothesizing of causal relations and rea-
soning on the basis of relevant data and subject knowledge. However, this representational facility
might not automatically apply to all inquiry tasks without adaptation. Third, formative assessment of
the learning process plays an important role in inquiry learning. The findings of this study indicate that
students may reach correct conclusions that are not grounded in systematic thinking and reasoning.
Learning the inquiry process is a more important outcome than correctly solving the problem. Teachers
may need to assess the students’ thinking and reasoning processes to understand their inquiry perform-
ance better, find the specific difficulties encountered by them, and make further improvement to the
scaffolding of inquiry learning.
Some issues or limitations exist in the current study. Group consensus building was measured by a self-reported questionnaire, which might have been insufficient to evaluate consensus-building activity. A more objective approach would be to assess it during the learning process on the basis of social interaction or group discussion records, as Gijlers and de Jong (2013) did, or to video-record group discussion processes. Examining the discussion process might lead to a fuller understanding of the effects of constructing the 3DTG on group consensus building and drawing conclusions.
Additional improvements may be needed to strengthen the 3DTG's support for drawing conclusions. One possible enhancement is to explicitly remind students to critique and revisit their
constructed maps. Research has indicated that combining constructing and critiquing concept maps can
support the integration of ideas (Schwendimann & Linn, 2016). These issues will be addressed in
future work. In addition, more research is needed to verify the effects of the 3DTG. As establishing the
validity of an instrumentation tool is an ongoing process (Lederman, Abd-El-Khalick, Bell, &
Schwartz, 2002), further validity tests of the questionnaire and rubrics for assessing the inquiry report
would be needed.
ORCID
Minhong Wang http://orcid.org/0000-0002-1084-6814
REFERENCES
Albanese, M. A., & Mitchell, S. (1993). Problem-based learning: A review of literature on its outcomes and implementation issues. Academic Medicine, 68(1), 52–81.
Bell, T., Urhahne, D., Schanze, S., & Ploetzner, R. (2010). Collaborative inquiry learning: Models, tools, and chal-
lenges. International Journal of Science Education, 32(3), 349–377. https://doi.org/10.1080/09500690802582241
Bransford, J., Brown, A. L., & Cocking, R. E. (1999). How people learn: Brain, mind, experience, and school.
Washington, DC: National Academy Press.
Brna, P., Cox, R., & Good, J. (2001). Learning to think and communicate with diagrams: 14 questions to consider.
Artificial Intelligence Review, 15(1–2), 115–134. https://doi.org/10.1023/A:1006584801959
Chinn, C. A., & Brewer, W. F. (1993). The role of anomalous data in knowledge acquisition: A theoretical frame-
work and implications for science instruction. Review of Educational Research, 63(1), 1–49. https://doi.org/10.
3102/00346543063001001
Cohen, J. (1992). Statistical power analysis. Current Directions in Psychological Science, 1(3), 98–101.
Cox, R. (1999). Representation construction, externalised cognition and individual differences. Learning and
Instruction, 9(4), 343–363.
Davis, E. A. (2003). Prompting middle school science students for productive reflection: Generic and directed
prompts. Journal of the Learning Sciences, 12(1), 91–142.
de Jong, T., & van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual
domains. Review of Educational Research, 68(2), 179–201.
Furtak, E. M., Seidel, T., Iverson, H., & Briggs, D. C. (2012). Experimental and quasi-experimental studies of
inquiry-based science teaching: A meta-analysis. Review of Educational Research, 82(3), 300–329. https://doi.
org/10.3102/0034654312457206
Geier, R., Blumenfeld, P. C., Marx, R. W., Krajcik, J. S., Fishman, B., Soloway, E., & Clay-Chambers, J. (2008).
Standardized test outcomes for students engaged in inquiry-based science curricula in the context of urban
reform. Journal of Research in Science Teaching, 45(8), 922–939. https://doi.org/10.1002/tea.20248
Germann, P. J., & Aram, R. J. (1996). Student performances on the science processes of recording data, analyzing
data, drawing conclusions, and providing evidence. Journal of Research in Science Teaching, 33(7), 773–798.
https://doi.org/10.1002/(SICI)1098-2736(199609)33:7<773::AID-TEA5>3.0.CO;2-K
Gijbels, D., Dochy, F., Van den Bossche, P., & Segers, M. (2005). Effects of problem-based learning: A
meta-analysis from the angle of assessment. Review of Educational Research, 75(1), 27–61.
Gijlers, H., & De Jong, T. (2005). The relation between prior knowledge and students’ collaborative discovery
learning processes. Journal of Research in Science Teaching, 42(3), 264–282. https://doi.org/10.1002/tea.20056
Gijlers, H., & de Jong, T. (2013). Using concept maps to facilitate collaborative simulation-based inquiry learning.
Journal of the Learning Sciences, 22(3), 340–374. https://doi.org/10.1080/10508406.2012.748664
Gijlers, H., Saab, N., Van Joolingen, W. R., De Jong, T., & Van Hout-Wolters, B. H. A. M. (2009). Interaction
between tool and talk: How instruction and tools support consensus building in collaborative inquiry-learning
environments. Journal of Computer Assisted Learning, 25(3), 252–267. https://doi.org/10.1111/j.1365-2729.
2008.00302.x
Grotzer, T. A., & Bell Basca, B. (2003). How does grasping the underlying causal structures of ecosystems impact
students’ understanding? Journal of Biological Education, 38(1), 16–29.
Hmelo-Silver, C. E., Duncan, R. G., & Chinn, C. A. (2007). Scaffolding and achievement in problem-based and
inquiry learning: A response to Kirschner, Sweller, and Clark (2006). Educational Psychologist, 42(2), 99–107.
Hogan, K., & Maglienti, M. (2001). Comparing the epistemological underpinnings of students’ and scientists’ reasoning
about conclusions. Journal of Research in Science Teaching, 38(6), 663–687. https://doi.org/10.1002/tea.1025
Hsu, C. C., Chiu, C. H., Lin, C. H., & Wang, T. I. (2015). Enhancing skill in constructing scientific explanations
using a structured argumentation scaffold in scientific inquiry. Computers and Education, 91, 46–59. https://doi.
org/10.1016/j.compedu.2015.09.009
Janssen, J., Erkens, G., Kirschner, P. A., & Kanselaar, G. (2010). Effects of representational guidance during computer-
supported collaborative learning. Instructional Science, 38(1), 59–88. https://doi.org/10.1007/s11251-008-9078-1
Jonassen, D. H. (1997). Instructional design models for well-structured and ill-structured problem-solving learning outcomes. Educational Technology Research and Development, 45(1), 65–94.
Kamarainen, A. M., Metcalf, S., Grotzer, T., & Dede, C. (2015). Exploring ecosystems from the inside: How
immersive multi-user virtual environments can support development of epistemologically grounded modeling
practices in ecosystem science instruction. Journal of Science Education and Technology, 24(2–3), 148–167.
https://doi.org/10.1007/s10956-014-9531-7
Keselman, A., Kaufman, D. R., Kramer, S., & Patel, V. L. (2007). Fostering conceptual change and critical reasoning
about HIV and AIDS. Journal of Research in Science Teaching, 44(6), 844–863. https://doi.org/10.1002/tea.20173
Keys, C. W. (2000). Investigating the thinking processes of eighth grade writers during the composition of a
scientific laboratory report. Journal of Research in Science Teaching, 37(7), 676–690.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An
analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching.
Educational Psychologist, 41(2), 75–86. https://doi.org/10.1207/s15326985ep4102_1
Klahr, D., & Dunbar, K. (1988). Dual space search during scientific reasoning. Cognitive Science, 12(1), 1–48.
https://doi.org/10.1016/0364-0213(88)90007-9
Kollar, I., Fischer, F., & Slotta, J. D. (2007). Internal and external scripts in computer-supported collaborative
inquiry learning. Learning and Instruction, 17(6), 708–721. https://doi.org/10.1016/j.learninstruc.2007.09.021
Krathwohl, D. R. (1998). Methods of educational and social science research: An integrated approach (2nd ed.).
New York: Longman.
Kuhn, D., Black, J., Keselman, A., & Kaplan, D. (2000). The development of cognitive skills to support inquiry
learning. Cognition and Instruction, 18(4), 495–523.
Lawson, A. E. (2005). What is the role of induction and deduction in reasoning and scientific inquiry? Journal of
Research in Science Teaching, 42(6), 716–740. https://doi.org/10.1002/tea.20067
Lawson, A. E., Banks, D. L., & Logvin, M. (2007). Self-efficacy, reasoning ability, and achievement in college
biology. Journal of Research in Science Teaching, 44(5), 706–724. https://doi.org/10.1002/tea.20172
Lazonder, A. W., & Harmsen, R. (2016). Meta-analysis of inquiry-based learning: Effects of guidance. Review of
Educational Research, 86(3), 681–718. https://doi.org/10.3102/0034654315627366
Lederman, N. G., Abd-El-Khalick, F., Bell, R. L., & Schwartz, R. S. (2002). Views of nature of science question-
naire: Toward valid and meaningful assessment of learners’ conceptions of nature of science. Journal of
Research in Science Teaching, 39(6), 497–521. https://doi.org/10.1002/tea.10034
Linn, M. C., & Slotta, J. D. (2000). WISE science. Educational Leadership, 58(2), 29–32.
Linn, M. C., & Songer, N. B. (1991). Teaching thermodynamics to middle school students: What are appropriate
cognitive demands? Journal of Research in Science Teaching, 28(10), 885–918. https://doi.org/10.1002/tea.
3660281003
Markow, P. G., & Lonning, R. A. (1998). Usefulness of concept maps in college chemistry laboratories: Students’
perceptions and effects on achievement. Journal of Research in Science Teaching, 35(9), 1015–1029.
Njoo, M., & De Jong, T. (1993a). Exploratory learning with a computer simulation for control theory: Learning
processes and instructional support. Journal of Research in Science Teaching, 30(8), 821–844. https://doi.org/10.
1002/tea.3660300803
Njoo, M., & de Jong, T. (1993b). Supporting exploratory learning by offering structured overviews of hypotheses. In D. M. Towne, T. de Jong, & H. Spada (Eds.), Simulation-based experiential learning (pp. 207–225). Berlin, Germany: Springer-Verlag.
Noroozi, O., Teasley, S. D., Biemans, H. J. A., Weinberger, A., & Mulder, M. (2013). Facilitating learning in multi-
disciplinary groups with transactive CSCL scripts. International Journal of Computer-Supported Collaborative
Learning, 8(2), 189–223. https://doi.org/10.1007/s11412-012-9162-z
Novak, J. D. (2002). Meaningful learning: The essential factor for conceptual change in limited or inappropriate
propositional hierarchies leading to empowerment of learners. Science Education, 86(4), 548–571.
Novak, J. D., Gowin, D. B., & Johansen, G. T. (1983). The use of concept mapping and knowledge vee mapping with junior high school science students. Science Education, 67(5), 625–645. https://doi.org/10.1002/sce.3730670511
Pedaste, M., de Jong, T., Sarapuu, T., Piksööt, J., van Joolingen, W. R., & Giemza, A. (2013). Investigating ecosystems as a blended learning experience. Science, 340(6140), 1537–1538.
Penner, D. E. (2000). Explaining systems: Investigating middle school students’ understanding of emergent phenom-
ena. Journal of Research in Science Teaching, 37(8), 784–806.
Quintana, C., Reiser, B. J., Davis, E. A., Krajcik, J., Fretz, E., Duncan, R. G., . . . Soloway, E. (2004). A scaffolding
design framework for software to support science inquiry. Journal of the Learning Sciences, 13(3), 337–386.
Roth, W. M., & McGinn, M. K. (1998). Inscriptions: Toward a theory of representing as social practice. Review of
Educational Research, 68(1), 35–59.
Roth, W. M., & Roychoudhury, A. (1993). The concept map as a tool for the collaborative construction of
knowledge: A microanalysis of high school physics students. Journal of Research in Science Teaching, 30(5),
503–534. https://doi.org/10.1002/tea.3660300508
Ruiz-Primo, M. A., Schultz, S. E., Li, M., & Shavelson, R. J. (2001). Comparison of the reliability and validity of
scores from two concept-mapping techniques. Journal of Research in Science Teaching, 38(2), 260–278.
Ruiz-Primo, M. A., & Shavelson, R. J. (1996). Problems and issues in the use of concept maps in science assess-
ment. Journal of Research in Science Teaching, 33(6), 569–600.
Sandoval, W. A. (2003). Conceptual and epistemic aspects of students’ scientific explanations. Journal of the
Learning Sciences, 12(1), 5–51.
Schwendimann, B. A., & Linn, M. C. (2016). Comparing two forms of concept map critique activities to facilitate
knowledge integration processes in evolution education. Journal of Research in Science Teaching, 53(1), 70–94.
https://doi.org/10.1002/tea.21244
Shin, N., Jonassen, D. H., & McGee, S. (2003). Predictors of well-structured and ill-structured problem solving in
an astronomy simulation. Journal of Research in Science Teaching, 40(1), 6–33.
Shute, V. J., & Glaser, R. (1990). A large-scale evaluation of an intelligent discovery world: Smithtown. Interactive
Learning Environments, 1(1), 51–77.
Singer, J., Marx, R. W., Krajcik, J., & Clay Chambers, J. (2000). Constructing extended inquiry projects:
Curriculum materials for science education reform. Educational Psychologist, 35(3), 165–178.
Slof, B., Erkens, G., Kirschner, P. A., & Helms-Lorenz, M. (2013). The effects of inspecting and constructing part-
task-specific visualizations on team and individual learning. Computers and Education, 60(1), 221–233.
https://doi.org/10.1016/j.compedu.2012.07.019
Slof, B., Erkens, G., Kirschner, P. A., Janssen, J., & Jaspers, J. G. M. (2012). Successfully carrying out complex
learning-tasks through guiding teams’ qualitative and quantitative reasoning. Instructional Science, 40(3),
623–643. https://doi.org/10.1007/s11251-011-9185-2
Slof, B., Erkens, G., Kirschner, P. A., & Jaspers, J. G. M. (2010). Design and effects of representational scripting
on group performance. Educational Technology Research and Development, 58(5), 589–608. https://doi.org/10.
1007/s11423-010-9148-3
Stoyanov, S., & Kommers, P. (2006). WWW-intensive concept mapping for metacognition in solving ill-structured
problems. International Journal of Continuing Engineering Education and Life Long Learning, 16(3),
297–316.
Stratford, S. J., Krajcik, J., & Soloway, E. (1998). Secondary students’ dynamic modeling processes: Analyzing, rea-
soning about, synthesizing, and testing models of stream ecosystems. Journal of Science Education and Technol-
ogy, 7(3), 215–234.
Suthers, D. D., & Hundhausen, C. D. (2002). The effects of representation on students’ elaborations in collabora-
tive inquiry. In Proceedings of the Conference on Computer Support for Collaborative Learning: Foundations for
a CSCL Community (pp. 472–480). Boulder, Colorado (USA). Hillsdale, NJ: Lawrence Erlbaum Associates.
Suthers, D. D., & Hundhausen, C. D. (2003). An experimental study of the effects of representational guidance on
collaborative learning processes. Journal of the Learning Sciences, 12(2), 183–218.
Suthers, D. D., Vatrapu, R., Medina, R., Joseph, S., & Dwyer, N. (2008). Beyond threaded discussion: Representa-
tional guidance in asynchronous collaborative learning environments. Computers and Education, 50(4),
1103–1127. https://doi.org/10.1016/j.compedu.2006.10.007
Toth, E. E., Suthers, D. D., & Lesgold, A. M. (2002). "Mapping to know": The effects of representational guidance and reflective assessment on scientific inquiry. Science Education, 86(2), 264–286. https://doi.org/10.1002/sce.10004
Van Bruggen, J. M., Kirschner, P. A., & Jochems, W. (2002). External representation of argumentation in CSCL
and the management of cognitive load. Learning and Instruction, 12(1), 121–138. https://doi.org/10.1016/S0959-
4752(01)00019-6
Van Joolingen, W. R., & De Jong, T. (1997). An extended dual search space model of scientific discovery learning.
Instructional Science, 25(5), 307–346.
van Joolingen, W. R., Savelsbergh, E. R., de Jong, T., Lazonder, A. W., & Manlove, S. (2005). Co-Lab: Research
and development of an online learning environment for collaborative scientific discovery learning. Computers in
Human Behavior, 21(4), 671–688. https://doi.org/10.1016/j.chb.2004.10.039
Wang, M., Cheng, B., Chen, J., Mercer, N., & Kirschner, P. A. (2017). The use of web-based collaborative concept
mapping to support group learning and interaction in an online environment. Internet and Higher Education, 34,
28–40. https://doi.org/10.1016/j.iheduc.2017.04.003
Wang, M., Kirschner, P. A., & Bridges, S. M. (2016). Computer-based learning environments for deep learning in
inquiry and problem-solving contexts. In Proceedings of the 12th International Conference of the Learning Scien-
ces (pp. 1356-1360). Singapore: International Society of the Learning Sciences.
Wang, M., Wu, B., Kinshuk, Chen, N.-S., & Spector, J. M. (2013). Connecting problem-solving and knowledge-
construction processes in a visualization-based learning environment. Computers and Education, 68, 293–306.
https://doi.org/10.1016/j.compedu.2013.05.004
Wecker, C., Rachel, A., Heran-Dörr, E., Waltner, C., Wiesner, H., & Fischer, F. (2013). Presenting theoretical ideas
prior to inquiry activities fosters theory-level knowledge. Journal of Research in Science Teaching, 50(10),
1180–1206. https://doi.org/10.1002/tea.21106
Weinberger, A., Stegmann, K., & Fischer, F. (2007). Knowledge convergence in collaborative learning: Concepts
and assessment. Learning and Instruction, 17(4), 416–426. https://doi.org/10.1016/j.learninstruc.2007.03.007
White, B. Y. (1993). ThinkerTools: Causal models, conceptual change, and science education. Cognition and
Instruction, 10(1), 1–100.
Windschitl, M. (2004). Folk theories of “inquiry:” How preservice teachers reproduce the discourse and practices of
an atheoretical scientific method. Journal of Research in Science Teaching, 41(5), 481–512.
Zeineddin, A., & Abd-El-Khalick, F. (2010). Scientific reasoning and epistemological commitments: Coordination
of theory and evidence among college science students. Journal of Research in Science Teaching, 47(9), 1064–
1093. https://doi.org/10.1002/tea.20368
Zimmerman, C., Raghavan, K., & Sartoris, M. L. (2003). The impact of the MARS curriculum on students’ ability
to coordinate theory and evidence. International Journal of Science Education, 25(10), 1247–1271. https://doi.
org/10.1080/0950069022000038303
How to cite this article: Chen J, Wang M, Grotzer TA, Dede C. Using a three-dimensional thinking graph to support inquiry learning. J Res Sci Teach. 2018;55:1239–1263. https://doi.org/10.1002/tea.21450