
Educational Technology Research and Development (2018) 66:1479–1503

https://doi.org/10.1007/s11423-018-9617-7

DEVELOPMENT ARTICLE

Effects of graphic organizers in online discussions: comparison between instructor-provided and student-generated

Kyungbin Kwon1 • Suhkyung Shin2 • Su Jin Park1


Published online: 20 July 2018
© Association for Educational Communications and Technology 2018

Abstract
The current experimental study examined the effects of graphic organizers in a collabo-
rative learning context where students constructed knowledge during online discussions.
As the results could vary depending on how students interacted with the graphic orga-
nizers, this study compared two different approaches: instructor-provided versus student-
generated graphic organizers. Thus, the purpose of this study was to investigate the effects
of receiving or generating graphic organizers on students’ engagement in online discus-
sions. Thirty-six graduate students enrolled in an online graduate course participated in the
study. While analyzing an instructional design case, students were asked to discuss design
issues in a randomly assigned group. There were three conditions: control condition
without graphic organizers, instructor-provided, and student-generated graphic organizers.
Major findings revealed that both generating and receiving graphic organizers facilitated
students’ higher levels of cognitive engagement, and encouraged students to consider
alternative views during the discussions. Without the graphic organizers, students tended to
simply summarize previous messages or raise new issues rather than elaborating on previous
topics. There was also a significant difference depending on how students interacted with the
graphic organizers: students discussed more topics when given the instructor's graphic
organizers than when asked to generate them.

Keywords Online discussion · Graphic organizer · Quality discussion · Activity theory · Cognitive load

Kyungbin Kwon (corresponding author)
kwonkyu@indiana.edu
Suhkyung Shin
suhkshin1@uos.ac.kr
Su Jin Park
park268@indiana.edu
1 Indiana University, 201 N. Rose Ave, Bloomington, IN 47405, USA
2 University of Seoul, 163 Seoulsiripdaero, Dongdaemun-gu, Seoul 02504, Korea


Introduction

Since the early 1960s, when Ausubel (1960) introduced advance organizers, graphic organizers
have been studied in text comprehension and scientific knowledge acquisition (e.g., Larkin
and Simon 1987; Robinson and Kiewra 1995). Graphic organizers have been utilized in
various forms such as matrices (Kiewra et al. 1999), concept maps (Novak 1990),
knowledge maps (O’Donnell et al. 2002), and types of diagrams (Butcher 2006; Robinson
and Skinner 1996).
Researchers have claimed that graphic organizers have positive effects on information
processing and learning such as reducing cognitive load (Stull and Mayer 2007), attaining
relational knowledge (DiCecco and Gleason 2002), recognizing the structures of data
(Keller et al. 2006), and enhancing metacognitive strategies (Novak 1990). Graphic
organizers are particularly beneficial in complex problem situations because they support
students' reasoning processes, such as identifying critical data, generating hypotheses, and
justifying those hypotheses (Wu et al. 2016).
As stated in Larkin and Simon (1987), graphic organizers spatially display relationships
among concepts, which is known to improve computational efficiency for searching and
recognizing relevant information, and drawing inferences from the searched information.
An expository text delivers concepts in words without explicitly representing the
relationships among elements, and thus requires readers to work out those relationships on
their own. A graphic organizer, in contrast, already exhibits the conceptual organization
through the spatial arrangement of its elements (Stull and Mayer 2007).
Using a graphic organizer can therefore be advantageous in online discussions where
complex problem situations are presented. Suppose students
engage in a discussion in a typical online threaded forum. The forum displays each
message in a separate post and arranges posts as an initial post and its responses. The
arrangement explicitly indicates the chronological order of posts but not necessarily the
relation of contents. In this context, students need to review individual posts and mentally
organize the relations of posts during the discussion, which, as a result, requires higher
cognitive demands (Larkin and Simon 1987). Considering that there are multiple threads in
common online discussions, the cognitive demands grow higher as the discussion evolves.
In that context, students' limited cognitive capacity may prevent them from viewing the
posts holistically, so they tend to focus only on a few easily accessible posts
(Hewitt 2005; Murphy and Coleman 2004).
A graphic organizer, when integrated into the online discussion properly, can resolve
the matter and facilitate better meaning-making (Jyothi et al. 2012). A graphic organizer
visualizes selected issues and their relations so that students can efficiently grasp the
relations of arguments, the contrasts of issues, and the flow of discussion (Kwon and Park
2017; Novak 1990; Nussbaum and Schraw 2007). It also reduces the demands of
coordination during group discussions by enhancing the "shared focus" among students
(Eryilmaz et al. 2013). Considering that discussions are a collaborative learning activity,
establishing a shared common ground is essential for the co-construction of meanings. When
students use an external representation tool "anchoring" their discussion, they can
reduce the mutual effort needed to coordinate social interactions during a discussion
(Eryilmaz et al. 2013).
Despite these advantages, however, utilizing graphic organizers for online discussions
is not common in teaching practice. A few pioneers have adopted
graphic organizers to facilitate better online discussions by presenting


alternative perspectives (Nussbaum et al. 2007), providing evidence for arguments (Suthers
and Hundhausen 2003), and representing a position on arguments such as "supports" or
"opposes" (Munneke et al. 2003). Although these studies have reported favorable
results for a variety of tools, integrating the tools into forums embedded in traditional
Learning Management Systems has proven inconvenient. For this reason, the
current study adopted a commonly accessible web 2.0 tool as an attempt to examine
whether incorporating graphic organizers created through the web 2.0 tool would enhance
the quality of discussion.
We also considered who created the graphic organizers, as student-generated and
instructor-provided organizers could result in different performances. Stull and Mayer (2007),
for example, found that students studying scientific passages performed better when they
received graphic organizers created by experts than when they were asked to construct
them. Although students could engage in generative learning activities while constructing
graphic organizers, the task turned out to be an extraneous process that hindered
an essential learning process. The result was explained based on the cognitive load theory
contextualized in individual learning. Because of the scarcity of studies comparing these
approaches in online collaborative learning contexts, it was unclear whether
the result and explanation of Stull and Mayer (2007) would apply to online discussions,
where students' collaborative meaning construction is key to learning. Thus, the current
study further examined the effects of the two approaches toward graphic organizers in
online discussion as well.

Theoretical framework

Quality of online discourses

A prevalent premise of student learning, from both cognitive and social constructivists’
perspectives (Jonassen et al. 1995; Lazonder et al. 2003), is that collaboration while
engaging in an online discussion leads to knowledge construction. What should be
examined next is the quality of the knowledge constructed through these online discus-
sions. Because online discourses are transparent and exhibit students' knowledge
construction processes, content analysis of discourses has been adopted in previous
research to evaluate the process of knowledge building and, further, the quality of the
knowledge built (see De Wever et al. 2006).
Although researchers’ tools for assessing online discourses vary, it is widely agreed that
a hierarchy exists in knowledge construction irrespective of the theoretical frameworks on
which each study is grounded. The three most frequently used frameworks found in the
literature are (1) critical thinking (Beckmann and Weber 2016; Henri 1992; Newman et al.
1995), (2) knowledge construction (Gunawardena et al. 1997; Heo et al. 2010; Pena-Shaff
and Nicholls 2004; Weinberger and Fischer 2006), and (3) Community of Inquiry (Anderson
et al. 2001; Garrison et al. 2001; Rourke et al. 2007). Depending on the theoretical
grounds, scholars have used different terms for the analysis schemes such as levels of
critical thinking, phases of knowledge construction, or phases of practical inquiry. All in
all, the consensus is that the higher the phase that online discourse falls into, the higher the
quality of meaning-making that occurs.
Grounded in the previous studies, the present study proposes three levels of knowledge
construction—initiate, develop, and construct—in order to understand students’ individual
thinking processes as well as social knowledge construction presented in online


discussions. In other words, students co-construct shared knowledge while initiating ideas,
developing the ongoing ideas, and further constructing the developed ideas. Table 1
describes the coding schemes that were developed based on the previous studies (Garrison
et al. 2001; Gunawardena et al. 1997; Järvelä and Häkkinen 2002; Veerman and Veldhuis-
Diermanse 2001).

Initiate

The first level of knowledge construction in the present study is initiation. Garrison et al.
(2001) indicated triggering events as the first phase of their practical inquiry model (PI
model). They particularly emphasized the importance of proper teaching presence that
"initiates and shapes" the beginning of discussions. When teaching presence is weaker,
students take the initiative by sharing or comparing surface-level information, according
to Gunawardena et al. (1997). In such cases, students can bring up or restate new ideas
made up of facts, experiences, opinions, or theories, as stated in Veerman and
Veldhuis-Diermanse (2001). Järvelä and Häkkinen (2002) called this early stage
lower-level discussion, in which students present opinions or comments while barely
taking previous posts into consideration.

Develop

The next level is the exploration of new or restated ideas, as in Garrison et al.’s (2001) PI
model. They explicated that this phase does not just deal with individual exploration. There
are back-and-forth shifts between individual and social exploration to develop the ideas
that have been stated. Students explain their own or others’ ideas while citing, giving
examples, articulating, or developing (Veerman and Veldhuis-Diermanse 2001), and thus
progressive dynamics in conversations are shown as stated in Järvelä and Häkkinen (2002).
At this level, it is crucial to check whether students offer alternative views beyond
elaborating on ideas in agreement. Gunawardena et al. (1997) referred to this as the second phase—
discovery/exploration of dissonance/inconsistency. Nussbaum and Schraw (2007) under-
scored the presentation of counterarguments as one of the essential keys to developing
critical thinking. Acknowledging counterarguments and standing against them requires

Table 1 Comparison of coding schemes


This study | Garrison et al. (2001) | Gunawardena et al. (1997) | Veerman and Veldhuis-Diermanse (2001) | Järvelä and Häkkinen (2002)
Initiate (bring up/restate) | Triggering event | Sharing/comparing of information | New ideas | Separate comments/opinions
Develop (elaborate/alternative) | Exploration | Discovery/exploration of dissonance/inconsistency | Explanation | Progressive knowledge building during dynamic conversations
Construct (evaluate/synthesize/reflect) | Integration/resolution | Negotiation of meaning/co-construction | Evaluation | Mutual negotiation


students to think more deeply; not only must they comprehend their own arguments, but
also analyze counterarguments.

Construct

This is the highly valued yet not-easy-to-reach phase in terms of learning that occurs
online. Garrison et al. (2001) stressed their third and fourth phases, integration and reso-
lution, respectively, which were interpreted as mutual negotiation in Järvelä and Häkki-
nen’s (2002) study. In the integration phase, students construct meanings from the
elaborated ideas that may or may not be influenced by alternative points of view while
evaluating, synthesizing, or reflecting on both their own and others’ ideas. Following this
phase, students can go through the resolution phase in which they try to apply a newly
proposed solution or test a hypothesis built in the previous phases. Garrison et al. (2001)
acknowledged that the resolution phase may hardly occur without instructors’ specific
requirements asking students to use the previously acquired knowledge to move onto the
next problems to be solved. The integration phase corresponds to phase 3 and the reso-
lution phase to phases 4 and 5 in Gunawardena et al.’s (1997) model. Gunawardena et al.
(1997) explained that students negotiate and co-construct meanings in phase 3. Phases 4
and 5, in which students are assumed to test ongoing-constructions and apply newly
constructed meanings, have been questioned in regard to accountability by Schellens and
Valcke (2005), who argued that it would not be easy for students to test newly constructed
knowledge unless they were specifically asked to do so.

Quality issues in online discussion

As aforementioned, instructors aim for higher levels of knowledge construction, not
merely lower ones, in online discussions. Nonetheless, previous studies (Gunawardena et al. 1997;
Hara et al. 2000; Heckman and Annabi 2005; McLoughlin and Luca 2000) have shown that
students’ messages do not exceed a certain level of critical thinking: the majority of
messages present superficial ideas rather than being fully elaborated and/or argued against
in the process of meaning negotiation. Celentin (2007), Schellens and Valcke (2005), and
Vaughan and Garrison (2005) confirmed previous findings that the majority of messages
were posted in the initial phases, particularly in the exploration phase.
Attaining a satisfactory quality of online discussion thus calls for specific instructor
interventions and instructional strategies (Rovai 2007). When the ultimate goal of
discussion is the integration of knowledge, a high level of cognitive engagement is
necessary for understanding the relationships between concepts, referring to concepts
to explain phenomena, and restructuring prior knowledge in a coherent way (Alavi and Tiwana 2002;
Davis 2000; Thomas 2002). The following section discusses previous studies of attempts to
improve the quality of online discourse using graphic organizers.

Graphic organizers for facilitating online discussion

In order to successfully engage students in online discussions, attention must be given to
the cognitive challenges that they may face in online learning environments. Collaborative
online discussions place a high demand on learners’ cognitive skills. Students need to
manage complex processes such as composing and posting original analyses with evi-
dence, as well as providing elaborations, questions, evaluations, and critiques of other posts


(Murphy and Coleman 2004; Rovai 2007). Previous studies have explored a variety of
tactics utilizing graphic organizers in online discussions to reduce learners’ cognitive
burden (Jyothi et al. 2012). For example, a descriptive image of participation was provided
to show students’ active and inactive involvement in group discussions on a bulletin board
(Erickson et al. 2002; Figueira and Laranjeiro 2007; Kwon and Park 2017), and a dis-
cussion map was utilized to present the development of conversations and group inter-
action patterns within an online forum (Frey et al. 2006; Gibbs et al. 2006; Jyothi et al.
2012; Kwon and Park 2017).
Although there are numerous instructional strategies to scaffold learners’ online dis-
cussion using graphic organizers, two specific approaches are identified from cognitive
theory and activity theory: an emphasis on reducing learners’ cognitive load by providing
graphic organizers and learners’ active engagement in a cognitive process by creating
graphic organizers (Stull and Mayer 2007). From this perspective, the current study dis-
tinguishes between instructor-provided and student-generated graphic organizers.
According to cognitive theory, providing learners with graphic organizers can benefit
them by reducing extraneous processing and increasing their germane processing, which
allows them to engage in the greater cognitive processing of learning contents (Chandler
and Sweller 1991; Mayer 2004; Mayer and Moreno 2003). Several studies have shown that
instructor-provided graphic organizers influence learners’ comprehension (Schwamborn
et al. 2011; Seufert 2003; Stull and Mayer 2007). Schwamborn et al. (2011) examined the
effect on text comprehension of learning with instructor-provided versus student-generated
graphic organizers in online learning environments. The authors found that the stu-
dents learning with instructor-provided graphic organizers gained higher scores on the
comprehension test than the group with student-generated graphic organizers. They noted
that instructor-provided graphic organizers might promote learners’ active generative
processes and reduce extraneous cognitive processes; as a result, they showed better
performance on content mastery and comprehension. In contrast, the process of creating
graphic organizers caused a cognitive burden resulting in less comprehension. Thus, it can
be expected that instructor-provided graphic organizers will help reduce the learners’
cognitive load and foster appropriate cognitive processing. This should help students to
better understand their peers’ discussion postings and efficiently engage them in collab-
orative group discussions in an online forum.
On the other hand, activity theory emphasizes students’ active engagement while cre-
ating graphic organizers, which serves to facilitate students’ deeper learning and better
connections with existing knowledge (Kiewra et al. 1999). Several researchers have also
argued that learners' drawing process involves the transformation of their understanding of
text-based information into a graphic representation, which helps learners increase their
text comprehension and fosters metacognitive processes such as self-monitoring (Blunt and
Karpicke 2014; Leopold and Leutner 2012; Schmeck et al. 2014; Van Meter and Garner
2005). As such, student-generated graphic organizers may promote the students’ under-
standing of discussion topics as well as help them in analyzing and monitoring group
discussions. Nonetheless, findings from other studies cast doubt on expected benefits of
learning with text and student-generated graphic organizers due to the high cognitive
demands of the integrating process (Leutner et al. 2009; Schwamborn et al. 2011). The
complexity involved in generating graphic organizers may create an extraneous cognitive
load, which may inhibit knowledge processes including evaluating arguments and con-
structing knowledge. Thus, further research needs to be conducted to determine whether
student-generated graphic organizers might promote students' engagement with online
discussion topics and recall of related knowledge.


In sum, although the use of graphic organizers may differ, few studies have examined
which approach is more effective in engaging learners in collaborative online discussions.
On one hand, when given instructor-provided graphic organizers, it may be expected that
learners effectively select relevant information and organize their thoughts logically during
group discussions in online learning environments. On the other hand, student-generated
graphic organizers may be more effective in complex online discussion environments by
enhancing learners' metacognitive abilities. Since graphic organizers may support learners'
cognitive and metacognitive skills in various ways, it is meaningful to explore their effects
in actual online discussion contexts.

Purpose of the current study

In the present study, we investigated the effects of graphic organizers that were utilized in
online discussions. The effects were identified through comparison with a control group
that participated in discussions without graphic organizers. We also examined potential
differences in the effects depending on how the graphic organizers were utilized. As
discussed, receiving versus generating graphic organizers during discussion might bring
different learning performances. Thus, the current study compared the effects of two
different approaches regarding the level of knowledge construction. This study provides
insight into how learners engage in collaborative online discussions upon either generating
or receiving graphic organizers. The following research questions were addressed:
1. Do students demonstrate a higher level of knowledge construction in online
discussions when receiving or generating graphic organizers?
2. What are the different effects of the two approaches, receiving versus generating
graphic organizers, on the level of knowledge construction?

Method

Participants and learning context

The participants were 36 graduate students enrolled in a 12-week online instructional
design course at a university in the United States. They were recruited from two online
sections of the same course, both taught by the same instructor. This course was chosen
because of its required online activities that demanded a number of collaborative group
discussions, aiming to provide students with the opportunity to develop an understanding
of concepts and knowledge in instructional design through their analysis of authentic cases.
As part of the regular course activities, students were required to analyze an instruc-
tional design (ID) case which was selected from the ID case book (Ertmer et al. 2013), and
engage in group discussions to discuss their case analysis over five nonconsecutive weeks.
A total of five ID cases were provided with leading questions that stimulated group
discussions but did not limit the topics. Among the five discussion forums, two discussion
forums of the second (Case A) and third (Case B) ID cases were selected and included in
the analysis for this study. Case A presented a number of instructional issues related to
learner motivation and engagement while instructional designers sought to convert face-to-
face learning activities to an online format. Specifically, students were asked to discuss the
instructional strategies to achieve not only conceptual changes but also behavioral changes.
Case B dealt with problems that a novice instructional designer faced while conducting a


task analysis to develop training in a manufacturing setting. Students were required to


identify the most effective way to conduct the task analysis in the workplace context where
several limitations were presented. Participation in the discussions was obligatory and
worth 20% of the final score. The group discussion occurred in an online forum embedded
in Canvas that provided discussion threads in rich text editor (RTE) mode. Students only
had access to their own group discussion forum and were not allowed to review those of
other groups.

Research design and procedure

This research was designed as a quasi-experimental study aiming to investigate the effect
of the graphic organizer on the quality of group discussion. The independent variable is the
type of graphic organizers and the dependent variables are the level of the knowledge
construction, types of discussion, and the scope and quantitative depth of discussion. The
discussion groups were assigned to one of three conditions: three groups to the control
condition, two to the instructor-provided condition, and two to the student-generated
graphic organizer condition. The lack of randomization in this study could cause selection bias and concerns
regarding group equivalence (Suresh 2011). To improve the validity of the research design,
we employed (1) a control group to compare the effectiveness among three conditions, and
(2) a switching-replications quasi-experimental design of two groups as Fig. 1 illustrates
(Cable 2001).
In the first week, students in the treatment groups received training and became capable of
utilizing Popplet to draw graphic organizers. Thus, the first week's discussion was
excluded from the data source. For Case A, the control groups engaged in the group
discussion without the Popplet tool, whereas the treatment groups discussed the case
under two different approaches (two groups receiving graphic organizers from the
instructor versus the other two generating graphic organizers themselves). For Case B,
the treatment groups were switched to the other condition. That is, the two discussion
groups that had received the instructor-generated discussion maps for Case A were asked
to create maps for Case B, while the other two groups did the reverse. No control group
was assigned for Case B (see Fig. 1).

Fig. 1 Grouping and instructional interventions


Web 2.0 tool for the graphic organizer

A web application, Popplet (http://popplet.com/), was used in this study to generate
graphic organizers that illustrated discussion topics and their expansion. The tool helped
students organize information not only to promote idea visualizations, but also to help
them make connections between each other’s main ideas and evidence. The graphic
organizers consisted of nodes representing topics and links between the nodes that
described the connections of the topics.

Instructor-provided graphic organizers

In the instructor-provided condition, the instructor generated and updated graphic orga-
nizers using Popplet for each group daily (see Fig. 2). The instructor summarized the main
contents of the messages along with their discussion topics in a node, and made links
between the nodes that had related topics. The instructor-provided graphic organizer was
published, and the link was embedded in the forum so that students in the
instructor-provided condition could review it while participating in their discussions.

Student-generated graphic organizers

In the student-generated condition, students created and updated graphic organizers using
Popplet on their own (see Fig. 3). The instructor guided the students to create nodes that
each included at least one discussion topic and to link them to related nodes, so that the
graphic organizer illustrated topics and their relations. Each group's Popplet page was
published and the link was inserted in the forum. The instructor registered individual
students to each Popplet group page, so they could visit their designated Popplet group
page and create nodes and links using their own IDs. They were allowed to create and
edit their nodes and links, and to view their graphic organizer being updated simultaneously.

Fig. 2 An example of the graphic organizers provided by the instructor


Fig. 3 An example of the graphic organizers generated by the students

Measurements

Students’ postings in the online forum and their graphic organizers were collected and
analyzed utilizing a content analysis method as follows.

Level of knowledge construction

A content analysis was conducted to evaluate the levels of knowledge construction and the
types of discussion. We followed the guidelines of Chi (1997) and Graneheim and Lundman (2004)
to develop a coding scheme and analyze the quality of discourses.
Coding scheme development. We developed a tentative coding scheme based on the
literature review as discussed in the theoretical framework. We independently conducted
an initial analysis of the messages with the coding scheme using the samples of groups. To
refine the coding scheme, we discussed the results and revised it until reaching a
consensus.
The primary purpose of the coding scheme was to evaluate the levels of knowledge
construction in the discussion. The developed coding scheme categorized messages into
the three levels: initiate, develop, and construct. The initiate level refers to the initial
stage of discussion, in which issues were brought up for discussion. In the next level,
students developed their ideas
by adding evidence, validating arguments, or providing alternative perspectives. Finally,
they constructed shared knowledge through evaluating arguments, synthesizing different
opinions, and reflecting on their discussion. Seven types of discussion were identified and
aligned with the level of knowledge construction as Table 2 illustrates. The initiate level


Table 2 Coding scheme of the level of knowledge construction


Category | Indicator | Description
Initiate | Bring up | Initiate discussion by raising a new issue
Initiate | Restate | Restate a topic discussed previously without meaningful addition or elaboration
Develop | Elaborate | Develop an issue by adding details, evidence, or arguments consistent with previous aspects
Develop | Alternative | Develop an issue by suggesting alternative or opposing aspects
Construct | Evaluate | Judge the quality of the arguments discussed
Construct | Synthesize | Consider alternative aspects and make a point by combining ideas
Construct | Reflect | Express that one learned something through discussion involving multiple perspectives

includes two types of discussion: bring up and restate, differentiating whether a student
raised a new issue or reaffirmed a topic discussed previously. The two discussion types at
the develop level are elaborate and alternative: developing an idea by adding information
or facts consistent with previous aspects, or by suggesting alternative or opposing aspects.
Three discussion types represent the construct level: evaluate, synthesize, and reflect.
To examine the relations of messages in sequence, we also classified messages into
initial messages and replies. An initial message was posted first and it mainly included an
individual’s analysis of the ID cases. A reply was a response to the initial message or other
replies and it included questions to a peer, feedback on others’ analysis, or additional
analysis reflecting on others’ arguments. Each initial and replied message was analyzed
using the coding scheme.
Segmentation and coding. In this study, the unit of analysis was the idea unit of meaning,
defined as a cluster of statements related to the same essential meaning (Graneheim and
Lundman 2004). Before coding, two coders, the authors of this study, segmented the
messages into idea units of meaning in consideration of the coding scheme and reached a
consensus on the segmentation. The two coders then independently carried out the content
analysis based on the coding scheme. About 57% of the data was analyzed in this way,
and the initial inter-rater reliability was substantial (Cohen's kappa = 0.785) (Landis and
Koch 1977). Once the reliability of the coding scheme was established, we finalized the
content analysis of the remaining data individually.
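The inter-rater reliability reported above can be computed directly from the two coders' labels. The following is a minimal sketch; the label lists are hypothetical illustrations, not the study's data:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' nominal labels of the same units."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of units labeled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, from each coder's marginal label distribution.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes drawn from the scheme (restate, elaborate, ...).
a = ["restate", "elaborate", "elaborate", "alternative", "evaluate", "restate"]
b = ["restate", "elaborate", "restate", "alternative", "evaluate", "restate"]
print(round(cohens_kappa(a, b), 3))  # 0.769, "substantial" per Landis and Koch (1977)
```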

Scope and quantitative depth of discussion

To measure the scope of discussion, we identified and counted the topics discussed in the
forums. A larger number of topics indicated a broader scope of discussion; a smaller
number, a narrower one. We also measured the quantitative depth of discussion by counting
the number of messages related to a given topic. Thus, the quantitative depth of discussion
represented how many times students discussed a topic. While the level of knowledge
construction evaluated the quality of discussion, the scope and depth of discussion
measured its quantity.
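As a sketch, the two quantity measures reduce to simple counts over coded (student, topic) message pairs; the data below is hypothetical:

```python
from collections import Counter

# Hypothetical (student, topic) pairs extracted from coded messages.
messages = [
    ("s1", "navigation"), ("s1", "feedback"), ("s1", "navigation"),
    ("s2", "feedback"), ("s2", "assessment"),
]

# Scope: how many distinct topics each student discussed.
students = {st for st, _ in messages}
scope = {s: len({t for st, t in messages if st == s}) for s in students}

# Quantitative depth: how many messages were posted on each topic.
depth = Counter(t for _, t in messages)

print(scope["s1"], depth["navigation"])  # 2 2
```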


Complexity and amount of information

The graphic organizers visualized the development of discussion topics as nodes and links.
A node contained a summary of the discussion in a box. Each node was connected to
other nodes with related topics; a node could be connected to multiple nodes when its
topic was discussed in other nodes. A cluster is a collection of nodes linked together,
which illustrates the development of a topic.
We used the number of nodes and links to measure the complexity of the graphic
organizers. The more nodes a cluster has and the more links a node has, the more complex
the graphic organizer is; the fewer nodes and links, the less complex. The number of
clusters indicates the amount of information in two ways. First, it represents the number of
topics in general. Second, it describes how many times a topic was discussed. Thus, a
graphic organizer with many clusters indicates that students discussed many topics, and a
cluster with many nodes indicates that students discussed a certain topic many times.
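These graph measures can be sketched as follows, assuming a graphic organizer is stored as a set of node ids plus undirected links between related nodes (the node names and links are hypothetical):

```python
def clusters(nodes, links):
    """Clusters = connected components of the node-link graph."""
    adj = {n: set() for n in nodes}
    for a, b in links:
        adj[a].add(b)
        adj[b].add(a)
    seen, comps = set(), []
    for n in nodes:
        if n in seen:
            continue
        stack, comp = [n], set()
        while stack:  # depth-first traversal of one component
            cur = stack.pop()
            if cur in seen:
                continue
            seen.add(cur)
            comp.add(cur)
            stack.extend(adj[cur] - seen)
        comps.append(comp)
    return comps

nodes = ["n1", "n2", "n3", "n4", "n5"]
links = [("n1", "n2"), ("n2", "n3"), ("n4", "n5")]
comps = clusters(nodes, links)
links_per_node = 2 * len(links) / len(nodes)  # each link touches two nodes
print(len(comps), round(links_per_node, 2))  # 2 1.2
```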

Results

Overview of discussions

Table 3 shows the average number of messages and meaning units posted in each learning
condition for the two ID cases, as well as the average number of words per meaning unit.
The results suggest that there was no statistically significant difference in the quantity of
discussion between the cases. Based on these results, the subsequent analyses were carried
out without distinguishing between the cases.

Knowledge construction in discussions

Tables 4, 5, and 6 describe the frequency and proportion of meaning units that students
posted in the three conditions: no graphic organizers, instructor-provided graphic
organizers, and student-generated graphic organizers. Table 4 shows the results for all
messages (initial messages and replies combined), and Tables 5 and 6 show the statistics
for the initial messages and the replies separately.

Table 3 Overview of discussion activities in each ID case

ID case   Condition    Number of   Number of   Mean messages   Mean meaning units   Mean words per
                       groups      students    per group       per group            meaning unit
CASE A    Total        7           36          19.6            108.7                39.2
          Control      3           17          23              117                  37.7
          Instructor   2           10          20.5            121                  42.5
          Student      2           9           13.5            83.5                 37.5
CASE B    Total        4           19          19.5            97.8                 35.2
          Instructor   2           9           21              100.5                37.2
          Student      2           10          18              95                   33.0

Table 4 Average number and proportion of meaning units posted by a student in all messages

              Average number                                  Average proportion (%)
              Control       Instructor-   Student-            Control   Instructor-   Student-      F      p
                            provided      generated                     provided      generated
Initiate      11.6 (5.21)   12.4 (4.73)   9.9 (6.79)          57        54            50            .96    .39
Bring up      6.3 (4.55)    5.0 (4.16)    4.6 (4.98)          28        20            21            .67    .51
Restate       5.4 (2.40)    7.4 (3.64)    5.3 (3.64)          29        34            29            2.36   .11
Develop       7.7 (3.20)    9.3 (5.28)    7.4 (3.75)          38        39            40            1.05   .36
Elaborate     7.7 (3.04)    7.3 (4.42)    5.5 (3.13)          38        30            29            1.91   .16
Alternative   .06 (.24)     2.0 (1.97)    2.0 (1.47)          0         9             11            10.2   .00
Construct     1.1 (.97)     1.5 (1.43)    1.3 (1.29)          5         7             10            .62    .54
Evaluate      .65 (.86)     .63 (.96)     .74 (.93)           4         2             6             .07    .93
Synthesize    .18 (.53)     .21 (.42)     .11 (.32)           1         1             1             .30    .74
Reflect       .24 (.44)     .68 (.89)     .47 (.70)           1         3             3             1.82   .17

The values in the average number columns represent the mean number of meaning units posted by a student
in each condition, with standard deviations in parentheses; the percentages were calculated as proportions
within each condition (control, instructor, and student)

Table 5 Average number and proportion of meaning units posted by a student in initial messages

              Average number                                  Average proportion (%)
              Control       Instructor-   Student-            Control   Instructor-   Student-      F      p
                            provided      generated                     provided      generated
Initiate      6.1 (2.98)    9.2 (1.14)    8.4 (5.89)          59        67            63            1.91   .16
Bring up      3.7 (3.41)    3.8 (3.55)    4.1 (4.81)          31        27            29            .06    .94
Restate       2.5 (1.94)    5.3 (4.07)    4.1 (3.07)          28        40            35            3.59   .04
Develop       4.1 (1.80)    4.5 (3.03)    4.5 (2.72)          41        32            37            .13    .88
Elaborate     4.1 (1.80)    3.8 (2.71)    3.6 (2.29)          41        28            31            .20    .82
Alternative   0 (0)         .5 (.77)      .6 (.76)            0         4             5             4.97   .01
Construct     .06 (.24)     .11 (.32)     .05 (.23)           1         1             0             .22    .80
Evaluate      0 (0)         .05 (.23)     0 (0)               0         0             0             .95    .40
Synthesize    .06 (.24)     .05 (.23)     0 (0)               1         1             0             .53    .59
Reflect       0 (0)         0 (0)         .05 (.23)           0         0             0             .95    .40

The values in the average number columns represent the mean number of meaning units posted by a student
in each condition, with standard deviations in parentheses; the percentages were calculated as proportions
within each condition (control, instructor, and student)

Levels of knowledge construction

Regarding the levels of knowledge construction (initiate, develop, and construct), no
statistically significant difference was found across the conditions in general. However, a
significant main effect of learning conditions was found at the "initiate" level when


Table 6 Average number and proportion of meaning units posted by a student in replies

              Average number                                  Average proportion (%)
              Control       Instructor-   Student-            Control   Instructor-   Student-      F       p
                            provided      generated                     provided      generated
Initiate      5.5 (3.02)    3.2 (2.25)    1.5 (1.50)          52        34            25            13.83   .00
Bring up      2.6 (1.97)    1.2 (1.61)    .5 (.61)            23        10            8             9.90    .00
Restate       2.9 (1.93)    2.1 (1.39)    1.2 (1.40)          30        24            18            5.03    .01
Develop       3.6 (2.43)    4.8 (3.52)    2.9 (2.16)          36        47            50            2.26    .12
Elaborate     3.5 (2.29)    3.4 (2.57)    1.8 (1.57)          36        33            29            3.48    .04
Alternative   .1 (.24)      1.5 (1.74)    1.3 (1.20)          0         14            21            6.74    .00
Construct     1 (.94)       1.4 (1.30)    1.3 (1.33)          11        19            25            .55     .58
Evaluate      .7 (.86)      .6 (.84)      .7 (.93)            9         5             14            .15     .86
Synthesize    .1 (.33)      .2 (.38)      .1 (.32)            1         2             1             .12     .89
Reflect       .2 (.44)      .7 (.89)      .4 (.69)            2         12            9             1.86    .17

The values in the average number columns represent the mean number of meaning units posted by a student
in each condition, with standard deviations in parentheses; the percentages were calculated as proportions
within each condition (control, instructor, and student)

students replied to others' posts, F(2, 52) = 13.83, p < .001, ηp² = .35 (see Table 6). Mean
comparisons performed as post hoc analyses, using Bonferroni correction for multiple
comparisons, showed that students who did not have graphic organizers posted more
"initiate" level messages (M = 5.5) when they replied to others' posts than those who
received instructor-provided graphic organizers (M = 3.2) or generated graphic organizers
themselves (M = 1.5), p = .01 and p < .001, respectively. As Fig. 4 describes, considering
the proportion of messages at each knowledge construction level, we found a

[Fig. 4: bar chart comparing the proportion of replies at the initiate, develop, and construct levels across the control, instructor-provided, and student-generated conditions]
Fig. 4 The proportion of knowledge construction levels of replies


clear pattern: those who interacted with graphic organizers posted higher proportions of
replies at the develop and construct levels, while those who did not have graphic
organizers posted the highest proportion of messages at the initiate level. The results
suggest that either receiving or generating graphic organizers regarding discussion topics
facilitated students' higher levels of cognitive engagement in discussion.
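The Bonferroni-corrected post hoc comparisons used here follow the standard recipe: multiply each pairwise p-value by the number of comparisons before testing against alpha. A minimal sketch with made-up p-values, not the study's results:

```python
# Hypothetical uncorrected p-values for the three pairwise condition comparisons.
raw_p = {
    ("control", "instructor"): 0.010,
    ("control", "student"): 0.0004,
    ("instructor", "student"): 0.200,
}
m = len(raw_p)  # number of comparisons
alpha = 0.05

# Bonferroni adjustment: multiply each p by m, capped at 1.0.
adjusted = {pair: min(1.0, p * m) for pair, p in raw_p.items()}
significant = {pair for pair, p in adjusted.items() if p < alpha}
print(sorted(significant))
```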

Types of discussion

Regarding the types of discussion (bring up, restate, elaborate, alternative, evaluate,
synthesize, and reflect), we carried out analyses of variance (ANOVAs) to examine the
differences depending on the learning conditions. ANOVAs on the "alternative" type
revealed significant main effects of learning conditions in all messages, F(2, 52) = 10.20,
p < .001, ηp² = .28, in initial messages, F(2, 52) = 4.97, p = .01, ηp² = .16, and in replies,
F(2, 52) = 6.74, p < .01, ηp² = .21 (see Tables 4, 5, and 6). A post hoc analysis of all
messages revealed that students who received instructor-provided graphic organizers
(M = 2.0) or generated graphic organizers (M = 2.0) posted more "alternative" type
messages than those who did not have graphic organizers (M = 0.1), p = .001 in both
cases. A post hoc analysis of initial messages revealed that students who generated graphic
organizers (M = 0.6) posted more "alternative" type initial messages than those who did
not have graphic organizers (M = 0), p = .014. A post hoc analysis of replies revealed that
students who received instructor-provided graphic organizers (M = 1.5) or generated them
(M = 1.3) posted more "alternative" type replies than those who did not have graphic
organizers (M = 0.1), p = .004 and p = .012, respectively. The results suggest that either
receiving or generating graphic organizers regarding discussion topics encouraged students
to consider alternative views while participating in the discussion.
ANOVAs of the "bring up" and "restate" types in the replies also revealed significant main
effects of learning conditions, F(2, 52) = 9.90, p < .001, ηp² = .28, and F(2, 52) = 5.03,
p = .01, ηp² = .16, respectively (see Table 6). Regarding the "bring up" type replies, a post
hoc analysis revealed that students who did not have graphic organizers replied to others
with more "bring up" messages (M = 2.6) than those who received instructor-provided
graphic organizers (M = 1.2) or generated them (M = .5), p = .012 and p < .001,
respectively. Regarding the "restate" type replies, a post hoc analysis revealed that
students who did not have graphic organizers replied to others with more restate-type
messages (M = 2.9) than those who generated graphic organizers (M = 1.2), p = .008.
These results suggest that students tended to bring up new topics or restate previous
messages while replying to others' messages if they did not have graphic organizers.
A statistically significant main effect of learning conditions was found for the "restate"
type in the initial messages, F(2, 52) = 3.59, p = .035, ηp² = .12 (see Table 5). A post hoc
analysis revealed that students who received instructor-provided graphic organizers posted
more initial messages that restated previous topics (M = 5.3) than those who did not have
graphic organizers (M = 2.5), p = .03. These results suggest that students made
connections with previous messages while initiating new discussion threads if they
received instructor-provided graphic organizers.
Although a statistically significant main effect of learning conditions was found for the
"elaborate" type in replies, F(2, 52) = 3.48, p = .038, ηp² = .12, a post hoc analysis did not
reveal any statistically significant difference between the conditions (see Table 6).
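The one-way ANOVAs reported in this section follow the standard between-group/within-group decomposition; the F statistic can be sketched from the definitional formulas. The scores below are made up for illustration:

```python
def one_way_anova_f(groups):
    """F = (between-group mean square) / (within-group mean square)."""
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    # Between-group sum of squares, df = k - 1
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares, df = N - k
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between = len(groups) - 1
    df_within = n_total - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical per-student counts of one message type in each condition.
control = [2, 3, 2, 4]
instructor = [1, 1, 2, 1]
student = [0, 1, 1, 0]
f = one_way_anova_f([control, instructor, student])
print(round(f, 2))  # 10.5
```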


Scope and quantitative depth of discussion

Students discussed 76 topics in total. We examined how many topics each student
discussed on average (scope of discussion) and how many messages each student posted
on a given topic on average (quantitative depth of discussion). Tables 7 and 8 display the
statistics of the scope and quantitative depth of discussion across the learning conditions.
We examined the differences in the scope and quantitative depth of discussion between
the learning conditions. Regarding the scope of discussion, ANOVAs revealed significant
main effects of learning conditions in all messages, F(2, 52) = 6.19, p < .001, ηp² = .19, in
initial messages, F(2, 52) = 5.04, p = .01, ηp² = .17, and in replies, F(2, 52) = 3.36, p = .04,
ηp² = .12 (see Table 7). A post hoc analysis of all messages revealed that students who
received instructor-provided graphic organizers (M = 10.2) discussed more topics than
those who generated graphic organizers (M = 7.2) and those who did not have them
(M = 7.4), p < .01 and p = .017, respectively. A post hoc analysis of initial messages
revealed that students who received instructor-provided graphic organizers discussed more
topics in their initial messages (M = 7.3) than those who did not have graphic organizers
(M = 4.5), p < .01. A post hoc analysis of replies revealed that students who received
instructor-provided graphic organizers (M = 4.8) discussed more topics when they replied
to others' messages than those who generated the graphic organizers (M = 2.9), p = .04.
Regarding the quantitative depth of discussion, we did not find any significant main effects
of learning conditions (see Table 8). These results suggest that students considered more
topics while participating in the discussion when they received instructor-provided graphic
organizers.
Based on these results, we predicted that the instructor-provided graphic organizers
would represent more of the topics discussed in the forums, while the student-generated
graphic organizers would contain fewer topics by being selective about the topics students
mainly discussed. The rationale for this prediction is that the instructor tried to represent
an overview of the discussion from an observer's perspective, while students collectively
generated the graphic organizers by updating the parts they discussed in the forums.
Therefore, it was probable that the instructor-provided graphic organizers would include
minor topics that students discussed less often in addition to the main topics, whereas the
student-generated graphic organizers would be selective toward the main topics. To test
this hypothesis, we analyzed the graphic organizers in terms of their complexity and
amount of information. The complexity of graphic organizers was measured by the
number of nodes and the average number of links per node. The amount of information
was measured by the number of clusters (collections of nodes linked together) and nodes
(see Table 9).

Table 7 Average number of topics discussed by a student

                   Control      Instructor-provided   Student-generated   F      p

All messages       7.4 (2.89)   10.2 (3.39)           7.2 (2.57)          6.19   .00
Initial messages   4.5 (1.70)   7.3 (3.16)            5.7 (2.61)          5.04   .01
Replies            4.3 (2.38)   4.8 (2.34)            2.9 (1.39)          3.36   .04


Table 8 Average number of messages posted per topic by a student

                   Control      Instructor   Student      F      p

All messages       3.0 (1.15)   2.3 (.53)    2.7 (.80)    2.38   .10
Initial messages   2.5 (1.17)   2.2 (.89)    2.6 (1.13)   2.20   .12
Replies            2.6 (.96)    2.0 (.52)    2.6 (1.23)   .68    .51

Table 9 Average number of clusters, nodes, and links in a graphic organizer

                      Total nodes   Links per node   Clusters   Nodes per cluster

Instructor-provided   20.3          1.56             6.3        3.2
Student-generated     18.8          1.79             3          6.2

Regarding complexity, the instructor-provided graphic organizers contained 20.3 nodes on
average, and each node had 1.56 links; the student-generated graphic organizers contained
18.8 nodes, and each node had 1.79 links. It is hard to say there was a meaningful
difference between the two types of graphic organizers in terms of complexity. However,
regarding the amount of information, the instructor-provided visuals displayed more
information than the student-generated ones. On average, instructor-provided graphic
organizers contained 6.3 clusters with 3.2 nodes per cluster, while student-generated
graphic organizers contained 3 clusters with 6.2 nodes per cluster. Considering that a
cluster consisted of the same discussion topic, this result reveals that instructor-provided
graphic organizers displayed more topics with fewer nodes, while student-generated
graphic organizers displayed fewer topics with more nodes. A content analysis of each
node also revealed that instructor-provided graphic organizers represented more topics
(M = 17.3) than student-generated ones (M = 9.3). The total number of ideas displayed in
instructor-provided graphic organizers (M = 45.8) also outnumbered that of student-
generated ones (M = 20.1).

Discussion

The purpose of this study was to investigate the effects of receiving or generating graphic
organizers on the level of knowledge construction in online discussions. The main findings
of this study can be summarized as follows: (1) Both generating and receiving graphic
organizers facilitated students' higher levels of cognitive engagement when students
replied to messages, and encouraged them to consider alternative views when they
developed initial messages and replied to other messages. (2) If students did not have
graphic organizers, they tended to summarize previous messages by restating their
arguments or to bring up new topics when they replied to others' messages. (3) When
students received the instructor's graphic organizers, they discussed more topics in their
initial messages as well as in their replies.


Graphic organizer effects

One significant finding is that graphic organizers affected students' cognitive engagement
in positive ways during the discussions. The results revealed that students in both the
generating and receiving conditions responded to each other's messages more actively by
elaborating arguments or suggesting alternative views. These results are consistent with
previous studies that showed the positive effects of graphic organizers on students'
cognitive engagement and critical thinking processes (Iandoli et al. 2014; Suthers and
Hundhausen 2003). From a social constructivist perspective, Jonassen et al. (1995) argued
that learning occurs when students construct meaning through interactions with others,
such as negotiating different ideas and reflecting on their own. Considering multiple
aspects and evaluating rationales are some of the most valuable learning experiences
derived from discussions that instructors commonly want to facilitate (Nussbaum and
Schraw 2007). In this study, the two types of graphic organizers enhanced students'
positive learning experiences by fostering their ability to see alternative views.
Furthermore, this study supports the previous findings that, without proper instructional
intervention, students tend to exchange superficial messages that lack sufficient
elaboration or critical debate (Gunawardena et al. 1997; Heckman and Annabi 2005;
McLoughlin and Luca 2000). The students in the control group tended to restate previous
arguments or raise new issues. Accordingly, many researchers have suggested that well-
designed instructional interventions such as graphic organizers, question prompts, and
specification of message types might be effective for students who are not able to engage in
higher-order thinking processes without specific guidance in online discussions (Bradley
et al. 2008; Jeong and Joung 2007; Kwon and Park 2017; Nussbaum and Schraw 2007). In
this study, graphic organizers promoted higher-order thinking processes, as measured by
the higher levels of knowledge construction during collaborative discussions.
Graphic organizers, whether received or generated by students, seemed to promote
students' consideration of alternative views. In this study, the students could see the
relations among arguments as illustrated by the graphic organizers, which might have
encouraged them to consider multiple perspectives. This is in line with the findings of
Leutner et al. (2009) that graphic organizers make relations visually salient and save
cognitive capacity so that students can engage in generative activities such as considering
alternative views. Given the graphic organizers, students could identify the connections
among multiple ideas more easily and evaluate the relations in an integrated manner
(Robinson and Kiewra 1995). By reviewing graphic organizers, students were also able to
view the entire discussion and spot at a glance which topics were mainly being discussed
(Gibbs et al. 2006).
Graphic organizers might also have enhanced a mutual awareness of discussion topics.
By illustrating opinions and related resources, the graphic organizer aided students in
explicitly sharing their positions and rationales with peers. The graphic organizer served as
an external group coordination tool, illustrating individuals’ ideas in a common space
(Eryilmaz et al. 2013). In this study, the shared awareness could have affected the students’
co-construction of meaning by prompting them to elaborate ideas through the considera-
tion and evaluation of peers’ multiple perspectives.
With regard to cognitive load theory, it is assumed that graphic organizers reduce
learners' cognitive demands in understanding text-based messages. The text-based
messages students posted on online discussion forums expressed ideas in depth but did not
make the relations between arguments visually salient (Suthers 2014). The structure of
threaded discussion forums explicitly connects the replies within a thread but excludes any
connection to ideas outside the thread. This exclusiveness hinders students from
organizing the overall discussion flow and guides them to focus only on the messages
within the thread (Murphy and Coleman 2004). In this context, students need to hold
information about what other peers discussed in order to develop their arguments, which is
a cognitively demanding task because they need to mentally draw the relations during the
process (Eryilmaz et al. 2013). In general, threaded discussion forums have a limited
capacity to "facilitate quick and clean overviews of busy forums" (Jyothi et al. 2012,
p. 31). Thus, graphic organizers, either provided by the instructor or generated by the
students in this study, seemed to overcome the limitations of threaded discussion forums
by presenting an overview of forum messages and illustrating the relations between
arguments discussed across discussion threads.

Generating versus receiving graphic organizers

The current study examined the effects of graphic organizers through two different
approaches: instructor-provided versus student-generated graphic organizers. No
significant main effect of the approach was found regarding the level of knowledge
construction or the types of discussion. Regarding the scope of discussion, however,
students discussed more topics when they were given graphic organizers that the
instructor created than when they generated them themselves. Why did students discuss
more topics when given instructor-provided graphic organizers? It might be due to the
difference in the information represented in the graphic organizers. Our analysis of the
two types of graphic organizers shows that the instructor-provided graphic organizers
contained more clusters, displaying more topics, than the student-generated ones in
general. In contrast, the student-generated graphic organizers had fewer clusters yet more
nodes, which indicates that fewer topics had been discussed more intensively. The results
suggest that instructor-provided graphic organizers represented a broad overview of the
discussion, whereas student-generated graphic organizers reflected students' selective
attention and emphasized the few topics mainly discussed.
We can also answer the question from the activity theory perspective as well as cognitive
load theory. Activity theory claims that learning occurs "base[d] on the interaction of
conscious meaning making and activity, which are dynamically evolving" (Jonassen 2000,
p. 105). The graphic organizer generated by students was part of an activity system in
which students interacted with each other in pursuit of a shared goal. In this activity
system, students used the graphic organizer as a tool to express their ideas and understand
others' ideas, which facilitated the construction of socially shared meanings (Jonassen and
Rohrer-Murphy 1999). The results suggest that students added nodes with reference to the
previous nodes when generating the graphic organizers, which was actualized in their
discussion activity by developing arguments in conjunction with prior discussions. In the
students' learning context, therefore, generating a graphic organizer seemed to affect the
students' discussion process, as it graphically manifested that process. Thus, it is
reasonable to conclude that generating graphic organizers allowed students to update them
with a focus on the topics they were more interested in, which reciprocally drew other
students' attention to those particular topics.
On the other hand, when given graphic organizers created by the instructor, students did
not update the graphic organizers but instead simply reviewed them to capture the current
flow of the discussion. While the instructor took an active role in generating the graphic
organizer, the students took a consumer role and took advantage of the graphic organizers.
One critical advantage of receiving the graphic organizers the instructor created was the
opportunity to review an expert's mental model that organized and summarized the
discussion from the instructor's perspective (Stull and Mayer 2007). Based on the expert
model, students were able to gain an overview of the discussions rather than focusing on a
few specific topics. In this context, instructor-provided graphic organizers seemed to have
extended students' attention to a broader range of topics.
Cognitive load theory explains the phenomenon in terms of students' limited cognitive
capacity and the cognitive demands of online discussion and related learning activities
(Sweller 1994). Generating graphic organizers would impose higher cognitive demands
than receiving them. While creating graphic organizers, students had to hold considerable
information, such as the issues of previous messages, the relations among them, and the
new arguments they wanted to add, which would easily consume their limited cognitive
capacity. Thus, it is reasonable that students had no choice but to focus on a few topics
they wanted to discuss. Conversely, when students had the instructor-provided graphic
organizers, they could offload this cognitive burden and save their cognitive capacity for
considering various topics and their relations. Thus, it is not surprising that students
discussed more topics when they were given (rather than generated) graphic organizers.
This result is noteworthy in that covering more or fewer topics was not associated with
the quality of discussion in terms of the levels of knowledge construction. We therefore
need to be cautious in drawing conclusions about the effects of the different approaches to
the graphic organizers. Although the graphic organizers had the same positive impact on
facilitating higher levels of knowledge construction, such as elaboration of ideas and
consideration of alternative views, students' learning activities were quite different
depending on whether they interacted with the graphic organizers by generating or
receiving them.

Implication and future study

The findings of this study highlight the role of graphic organizers as crucial tools to
foster students' cognitive engagement in collaborative online discussions. In general,
graphic organizers support students' cognitive and metacognitive skills by reducing
cognitive difficulties and facilitating knowledge construction (Chandler and Sweller 1991;
Mayer 2004; Mayer and Moreno 2003). However, few studies have examined how to
utilize them in ways that positively impact students' knowledge construction during
collaborative discussions. Given the importance of reducing students' cognitive demands,
the design and use of graphic organizers should consider students' cognitive capacity as
well as their information processing. In particular, learners may face challenges in online
learning environments due to complexities that impose cognitive burdens on them. The
findings from this study provide a better understanding of how to support collaborative
discussions in online learning environments.
There are some limitations in this study that are worth investigating in future research.
First, although this study showed that the instructor-provided graphic organizers impacted
students’ knowledge construction, each student’s mental model might differently affect the
effectiveness of the graphic organizers. Given that individual students had different mental
models, and constructed and reproduced knowledge based on the retrieval of what they had
experienced and learned, it might bring different benefits to individual learners depending
on how they processed the graphic organizers. Thus, further study is necessary to explore


how students perceive and utilize instructor-provided graphic organizers in ways that may
or may not be consistent with their mental model.
Second, further investigations should examine why students discussed more topics when
using the instructor-provided graphic organizers than when creating graphic organizers
themselves. Although this study offers plausible explanations based on cognitive load
theory and activity theory, we did not provide direct evidence of why students covered
broader topics with the instructor-provided graphic organizers. Further investigation is
necessary to provide a more in-depth explanation of this issue.
In addition, our study suggests that creating a graphic organizer may serve as a scaffold
to facilitate students' knowledge construction and positively impact students' cognitive
engagement. While creating graphic organizers, students may experience difficulty
analyzing and representing main ideas, managing information, and evaluating relevant
argumentation, since discussions are complex, open-ended, and involve multiple
perspectives (Bean et al. 1986; Leutner et al. 2009). Thus, students may require additional
guidelines to help them overcome these difficulties. Given that students interact with
technology tools in dynamic ways, providing proper guidelines and scaffolding is essential
to facilitate quality discussions in online learning environments.
Lastly, this study has limited external generalizability because of the small sample size
and the lack of randomization in grouping students. Thus, one should be cautious in
generalizing the findings.
Compliance with ethical standards

Conflict of interest The authors declare that they have no conflict of interest.

Ethical approval All procedures performed in studies involving human participants were in accordance with
the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki
declaration and its later amendments or comparable ethical standards.

References
Alavi, M., & Tiwana, A. (2002). Knowledge integration in virtual teams: The potential role of KMS.
Journal of the American Society for Information Science and Technology, 53(12), 1029–1037. https://
doi.org/10.1002/asi.10107.
Anderson, T., Howe, C., Soden, R., Halliday, J., & Low, J. (2001). Peer interaction and the learning of
critical thinking skills in further education students. Instructional Science, 29(1), 1–32. https://doi.org/
10.1023/a:1026471702353.
Ausubel, D. P. (1960). The use of advance organizers in the learning and retention of meaningful verbal
material. Journal of Educational Psychology, 51(5), 267–272. https://doi.org/10.1037/h0046669.
Bean, T. W., Singer, H., Sorter, J., & Frazee, C. (1986). The effect of metacognitive instruction in outlining
and graphic organizer construction on students’ comprehension in a tenth-grade world history class.
Journal of Reading Behavior, 18(2), 153–169. https://doi.org/10.1080/10862968609547562.
Beckmann, J., & Weber, P. (2016). Cognitive presence in virtual collaborative learning: Assessing and
improving critical thinking in online discussion forums. Interactive Technology and Smart Education,
13(1), 52–70. https://doi.org/10.1108/ITSE-12-2015-0034.
Blunt, J. R., & Karpicke, J. D. (2014). Learning with retrieval-based concept mapping. Journal of Educational Psychology, 106(3), 849–858. https://doi.org/10.1037/a0035934.
Bradley, M. E., Thom, L. R., Hayes, J., & Hay, C. (2008). Ask and you will receive: How question type
influences quantity and quality of online discussions. British Journal of Educational Technology,
39(5), 888–900. https://doi.org/10.1111/j.1467-8535.2007.00804.x.
Butcher, K. R. (2006). Learning from text with diagrams: Promoting mental model development and
inference generation. Journal of Educational Psychology, 98(1), 182. https://doi.org/10.1037/0022-
0663.98.1.182.

Cable, G. (2001). Enhancing causal interpretations of quality improvement interventions. Quality in Health
Care, 10(3), 179–186. https://doi.org/10.1136/qhc.0100179.
Celentin, P. (2007). Online education: Analysis of interaction and knowledge building patterns among
foreign language teachers. International Journal of E-Learning & Distance Education, 21(3), 39–58.
Chandler, P., & Sweller, J. (1991). Cognitive load theory and the format of instruction. Cognition and
Instruction, 8(4), 293–332. https://doi.org/10.1207/s1532690xci0804_2.
Chi, M. T. H. (1997). Quantifying qualitative analyses of verbal data: A practical guide. Journal of the
Learning Sciences, 6(3), 271–315. https://doi.org/10.1207/s15327809jls0603_1.
Davis, E. A. (2000). Scaffolding students’ knowledge integration: Prompts for reflection in KIE. International Journal of Science Education, 22(8), 819–837. https://doi.org/10.1080/095006900412293.
De Wever, B., Schellens, T., Valcke, M., & Van Keer, H. (2006). Content analysis schemes to analyze
transcripts of online asynchronous discussion groups: A review. Computers & Education, 46(1), 6–28.
https://doi.org/10.1016/j.compedu.2005.04.005.
DiCecco, V. M., & Gleason, M. M. (2002). Using graphic organizers to attain relational knowledge from
expository text. Journal of Learning Disabilities, 35(4), 306–320. https://doi.org/10.1177/
00222194020350040201.
Erickson, T., Halverson, C., Kellogg, W. A., Laff, M., & Wolf, T. (2002). Social translucence: Designing
social infrastructures that make collective activity visible. Communications of the ACM, 45(4), 40–44.
https://doi.org/10.1145/505248.505270.
Ertmer, P. A., Quinn, J., & Glazewski, K. D. (2013). The ID casebook: Case studies in instructional design
(4th ed.). Boston: Pearson Education Inc.
Eryilmaz, E., Van der Pol, J., Ryan, T., Clark, P., & Mary, J. (2013). Enhancing student knowledge acquisition from online learning conversations. International Journal of Computer-Supported Collaborative Learning, 8(1), 113–144. https://doi.org/10.1007/s11412-012-9163-y.
Figueira, Á. R., & Laranjeiro, J. B. (2007). Interaction visualization in web-based learning using igraphs. Paper presented at the ACM Conference on Hypertext and Hypermedia, Manchester, UK.
Frey, B. A., Sass, M. S., & Alman, S. W. (2006). Mapping MLIS asynchronous discussions. International
Journal of Instructional Technology and Distance Learning, 3(1), 3–16.
Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer
conferencing in distance education. American Journal of Distance Education, 15(1), 7–23. https://doi.
org/10.1080/08923640109527071.
Gibbs, W. J., Olexa, V., & Bernas, R. S. (2006). A visualization tool for managing and studying online
communications. Journal of Educational Technology & Society, 9(3), 232–243.
Graneheim, U. H., & Lundman, B. (2004). Qualitative content analysis in nursing research: Concepts,
procedures and measures to achieve trustworthiness. Nurse Education Today, 24(2), 105–112. https://
doi.org/10.1016/j.nedt.2003.10.001.
Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of a global online debate and the
development of an interaction analysis model for examining social construction of knowledge in
computer conferencing. Journal of Educational Computing Research, 17(4), 397–431. https://doi.org/
10.2190/7MQV-X9UJ-C7Q3-NRAG.
Hara, N., Bonk, C., & Angeli, C. (2000). Content analysis of online discussion in an applied educational
psychology course. Instructional Science, 28(2), 115–152. https://doi.org/10.1023/A:1003764722829.
Heckman, R., & Annabi, H. (2005). A content analytic comparison of learning processes in online and face-
to-face case study discussions. Journal of Computer-Mediated Communication. https://doi.org/10.
1111/j.1083-6101.2005.tb00244.x.
Henri, F. (1992). Computer conferencing and content analysis. In A. Kaye (Ed.), Collaborative learning
through computer conferencing: The Najaden papers (Vol. 90, pp. 117–136). Berlin: Springer.
Heo, H., Lim, K. Y., & Kim, Y. (2010). Exploratory study on the patterns of online interaction and
knowledge co-construction in project-based learning. Computers & Education, 55(3), 1383–1392.
https://doi.org/10.1016/j.compedu.2010.06.012.
Hewitt, J. (2005). Toward an understanding of how threads die in asynchronous computer conferences.
Journal of the Learning Sciences, 14(4), 567–589. https://doi.org/10.1207/s15327809jls1404_4.
Iandoli, L., Quinto, I., De Liddo, A., & Buckingham Shum, S. (2014). Socially augmented argumentation
tools: Rationale, design and evaluation of a debate dashboard. International Journal of Human-
Computer Studies, 72(3), 298–319. https://doi.org/10.1016/j.ijhcs.2013.08.006.
Järvelä, S., & Häkkinen, P. (2002). Web-based cases in teaching and learning—The quality of discussions
and a stage of perspective taking in asynchronous communication. Interactive Learning Environments,
10(1), 1–22. https://doi.org/10.1076/ilee.10.1.1.3613.

Jeong, A., & Joung, S. (2007). Scaffolding collaborative argumentation in asynchronous discussions with
message constraints and message labels. Computers & Education, 48(3), 427–445. https://doi.org/10.
1016/j.compedu.2005.02.002.
Jonassen, D. H. (2000). Revisiting activity theory as a framework for designing student-centered learning environments. In D. H. Jonassen & S. M. Land (Eds.), Theoretical foundations of learning environments (pp. 89–121). Mahwah, New Jersey: Lawrence Erlbaum.
Jonassen, D. H., Davidson, M., Collins, M., Campbell, J., & Haag, B. B. (1995). Constructivism and
computer-mediated communication in distance education. American Journal of Distance Education,
9(2), 7–26. https://doi.org/10.1080/08923649509526885.
Jonassen, D. H., & Rohrer-Murphy, L. (1999). Activity theory as a framework for designing constructivist
learning environments. Educational Technology Research and Development, 47(1), 61–79. https://doi.
org/10.1007/bf02299477.
Jyothi, S., McAvinia, C., & Keating, J. (2012). A visualisation tool to aid exploration of students’ interactions in asynchronous online communication. Computers & Education, 58(1), 30–42. https://doi.org/10.1016/j.compedu.2011.08.026.
Keller, T., Gerjets, P., Scheiter, K., & Garsoffky, B. (2006). Information visualizations for knowledge
acquisition: The impact of dimensionality and color coding. Computers in Human Behavior, 22(1),
43–65. https://doi.org/10.1016/j.chb.2005.01.006.
Kiewra, K. A., Kauffman, D. F., Robinson, D. H., Dubois, N. F., & Staley, R. K. (1999). Supplementing
floundering text with adjunct displays. Instructional Science, 27(5), 373–401. https://doi.org/10.1023/a:
1003270723360.
Kwon, K., & Park, S. J. (2017). Effects of discussion representation: Comparisons between social and
cognitive diagrams. Instructional Science, 45(4), 469–491. https://doi.org/10.1007/s11251-017-9412-6.
Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174. https://doi.org/10.2307/2529310.
Larkin, J. H., & Simon, H. A. (1987). Why a diagram is (sometimes) worth ten thousand words. Cognitive
Science, 11(1), 65–100. https://doi.org/10.1111/j.1551-6708.1987.tb00863.x.
Lazonder, A. W., Wilhelm, P., & Ootes, S. A. W. (2003). Using sentence openers to foster student interaction in computer-mediated learning environments. Computers & Education, 41(3), 291–308. https://doi.org/10.1016/S0360-1315(03)00050-2.
Leopold, C., & Leutner, D. (2012). Science text comprehension: Drawing, main idea selection, and summarizing as learning strategies. Learning and Instruction, 22(1), 16–26. https://doi.org/10.1016/j.learninstruc.2011.05.005.
Leutner, D., Leopold, C., & Sumfleth, E. (2009). Cognitive load and science text comprehension: Effects of
drawing and mentally imagining text content. Computers in Human Behavior, 25(2), 284–289. https://
doi.org/10.1016/j.chb.2008.12.010.
Mayer, R. E. (2004). Should there be a three-strikes rule against pure discovery learning? American Psychologist, 59(1), 14–19. https://doi.org/10.1037/0003-066X.59.1.14.
Mayer, R. E., & Moreno, R. (2003). Nine ways to reduce cognitive load in multimedia learning. Educational
Psychologist, 38(1), 43–52. https://doi.org/10.1207/S15326985EP3801_6.
McLoughlin, C., & Luca, J. (2000). Cognitive engagement and higher order thinking through computer
conferencing: We know why but do we know how? The 9th Annual Teaching Learning Forum.
Retrieved from http://ctl.curtin.edu.au/events/conferences/tlf/tlf2000/mcloughlin.html.
Munneke, L., van Amelsvoort, M., & Andriessen, J. (2003). The role of diagrams in collaborative argumentation-based learning. International Journal of Educational Research, 39(1–2), 113–131. https://doi.org/10.1016/S0883-0355(03)00076-4.
Murphy, E., & Coleman, E. (2004). Graduate students’ experiences of challenges in online asynchronous discussions. Canadian Journal of Learning and Technology. https://doi.org/10.21432/T27G7N.
Newman, D. R., Webb, B., & Cochrane, C. (1995). A content analysis method to measure critical thinking in
face-to-face and computer supported group learning. Interpersonal Computing and Technology, 3(2),
56–77.
Novak, J. D. (1990). Concept maps and Vee diagrams: Two metacognitive tools to facilitate meaningful
learning. Instructional Science, 19(1), 29–52. https://doi.org/10.1007/BF00377984.
Nussbaum, E. M., & Schraw, G. (2007). Promoting argument-counterargument integration in students’
writing. The Journal of Experimental Education, 76(1), 59–92. https://doi.org/10.3200/JEXE.76.1.59-
92.
Nussbaum, E. M., Winsor, D., Aqui, Y., & Poliquin, A. (2007). Putting the pieces together: Online argumentation vee diagrams enhance thinking during discussions. International Journal of Computer-Supported Collaborative Learning, 2(4), 479–500. https://doi.org/10.1007/s11412-007-9025-1.

O’Donnell, A. M., Dansereau, D. F., & Hall, R. H. (2002). Knowledge maps as scaffolds for cognitive
processing. Educational Psychology Review, 14(1), 71–86. https://doi.org/10.1023/a:1013132527007.
Pena-Shaff, J. B., & Nicholls, C. (2004). Analyzing student interactions and meaning construction in
computer bulletin board discussions. Computers & Education, 42(3), 243–265. https://doi.org/10.1016/
j.compedu.2003.08.003.
Robinson, D. H., & Kiewra, K. A. (1995). Visual argument: Graphic organizers are superior to outlines in
improving learning from text. Journal of Educational Psychology, 87(3), 455–467. https://doi.org/10.
1037/0022-0663.87.3.455.
Robinson, D. H., & Skinner, C. H. (1996). Why graphic organizers facilitate search processes: Fewer words
or computationally efficient indexing? Contemporary Educational Psychology, 21(2), 166–180. https://
doi.org/10.1006/ceps.1996.0014.
Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (2007). Assessing social presence in asynchronous
text-based computer conferencing. The Journal of Distance Education, 14(2), 50–71.
Rovai, A. P. (2007). Facilitating online discussions effectively. The Internet and Higher Education, 10(1),
77–88. https://doi.org/10.1016/j.iheduc.2006.10.001.
Schellens, T., & Valcke, M. (2005). Collaborative learning in asynchronous discussion groups: What about
the impact on cognitive processing? Computers in Human Behavior, 21(6), 957–975. https://doi.org/
10.1016/j.chb.2004.02.025.
Schmeck, A., Mayer, R. E., Opfermann, M., Pfeiffer, V., & Leutner, D. (2014). Drawing pictures during
learning from scientific text: Testing the generative drawing effect and the prognostic drawing effect.
Contemporary Educational Psychology, 39(4), 275–286. https://doi.org/10.1016/j.cedpsych.2014.07.
003.
Schwamborn, A., Thillmann, H., Opfermann, M., & Leutner, D. (2011). Cognitive load and instructionally
supported learning with provided and learner-generated visualizations. Computers in Human Behavior,
27(1), 89–93. https://doi.org/10.1016/j.chb.2010.05.028.
Seufert, T. (2003). Supporting coherence formation in learning from multiple representations. Learning and
Instruction, 13(2), 227–237. https://doi.org/10.1016/S0959-4752(02)00022-1.
Stull, A. T., & Mayer, R. E. (2007). Learning by doing versus learning by viewing: Three experimental
comparisons of learner-generated versus author-provided graphic organizers. Journal of Educational
Psychology, 99(4), 808–820. https://doi.org/10.1037/0022-0663.99.4.808.
Suresh, K. P. (2011). An overview of randomization techniques: An unbiased assessment of outcome in
clinical research. Journal of Human Reproductive Sciences, 4(1), 8–11. https://doi.org/10.4103/0974-
1208.82352.
Suthers, D. D. (2014). Empirical studies of the value of conceptually explicit notations in collaborative
learning. In A. Okada, S. J. Buckingham-Shum, & T. Sherborne (Eds.), Knowledge cartography:
Software tools and mapping techniques (pp. 1–22). London: Springer.
Suthers, D. D., & Hundhausen, C. D. (2003). An experimental study of the effects of representational
guidance on collaborative learning processes. Journal of the Learning Sciences, 12(2), 183–218.
https://doi.org/10.1207/S15327809JLS1202_2.
Sweller, J. (1994). Cognitive load theory, learning difficulty, and instructional design. Learning and
Instruction, 4(4), 295–312. https://doi.org/10.1016/0959-4752(94)90003-5.
Thomas, M. J. W. (2002). Learning within incoherent structures: The space of online discussion forums.
Journal of Computer Assisted Learning, 18(3), 351–366. https://doi.org/10.1046/j.0266-4909.2002.
03800.x.
Van Meter, P., & Garner, J. (2005). The promise and practice of learner-generated drawing: Literature
review and synthesis. Educational Psychology Review, 17(4), 285–325. https://doi.org/10.1007/
s10648-005-8136-3.
Vaughan, N., & Garrison, D. R. (2005). Creating cognitive presence in a blended faculty development
community. The Internet and Higher Education, 8(1), 1–12. https://doi.org/10.1016/j.iheduc.2004.11.
001.
Veerman, A., & Veldhuis-Diermanse, E. (2001). Collaborative learning through computer-mediated communication in academic education. Paper presented at Euro CSCL 2001, Maastricht: McLuhan Institute, University of Maastricht.
Weinberger, A., & Fischer, F. (2006). A framework to analyze argumentative knowledge construction in
computer-supported collaborative learning. Computers & Education, 46(1), 71–95. https://doi.org/10.
1016/j.compedu.2005.04.003.
Wu, B., Wang, M., Grotzer, T. A., Liu, J., & Johnson, J. M. (2016). Visualizing complex processes using a
cognitive-mapping tool to support the learning of clinical reasoning. BMC Medical Education, 16(1),
216. https://doi.org/10.1186/s12909-016-0734-x.

Kyungbin Kwon is an Assistant Professor in Instructional Systems Technology, Indiana University, Bloomington, IN, USA.

Suhkyung Shin is a Research Professor in the Office of Educational Innovation at the University of Seoul, Seoul, Korea.

Su Jin Park is a Doctoral Candidate in Literacy, Culture, and Language Education, Indiana University,
Bloomington, IN, USA.
