
Computers & Education 59 (2012) 1122–1135


Comparing the social knowledge construction behavioral patterns of problem-based online asynchronous discussion in e/m-learning environments
Yu-Feng Lan a,*, Pei-Wei Tsai a, Shih-Hsien Yang b, Chun-Ling Hung c,d

a Department of Information Management, National Formosa University, No. 64, Wunhua Road, Huwei Township, Yunlin County 632, Taiwan
b Department of Applied Foreign Languages, National Formosa University, No. 64, Wunhua Road, Huwei Township, Yunlin County 632, Taiwan
c Department of Industrial Education and Technology, National Changhua University of Education, No. 1, Jin-De Road, Changhua City 500, Taiwan
d Department of International Business Administration, Chienkuo Technology University, No. 1, Chiehshou North Road, Changhua City 500, Taiwan

Article info

Article history:
Received 30 November 2011
Received in revised form 8 May 2012
Accepted 9 May 2012

Keywords:
Computer-mediated communication
Evaluation methodologies
Teaching/learning strategies

Abstract

In recent years, researchers have conducted various studies on applying wireless networking technology and mobile devices in education settings. However, research on behavioral patterns in learners' online asynchronous discussions with mobile devices is limited. The purposes of this study are to develop a mobile learning system, the mobile interactive teaching feedback system (MITFS), linked to both mobile devices and the internet, to support learners with online asynchronous discussion, and to combine content analysis and sequential analysis to compare and contrast the social knowledge construction behavioral patterns of problem-based asynchronous discussion in e-learning and m-learning environments. This study investigated four weeks of online discussions in an "Introduction to Computer Science" course involving forty first-year university students. The control group (online asynchronous discussion without mobile devices) and the experimental group (online asynchronous discussion with mobile devices) were compared. Using content analysis and sequential analysis to examine the behavioral patterns of, and the differences between, students in the control and experimental groups during problem-based online asynchronous discussion, the results showed that using mobile devices in online asynchronous discussion influenced students' learning performance. Some interesting results were found. Firstly, when the students used mobile devices in discussion situations, they engaged more in reflective thinking, shared more information, and further facilitated social knowledge construction among group members. Secondly, the experimental group performed better than the control group in terms of participation and diversity in knowledge construction behavioral patterns. Finally, based upon the findings, some implications are proposed for further research.

© 2012 Elsevier Ltd. All rights reserved.

1. Introduction

Problem-based learning (PBL) is an instructional strategy that is organized around the investigation and resolution of problems (Trop &
Sage, 2002). The learning activity is grounded in a general problem which has multiple possible solutions and methods of addressing the
problem. According to the assigned problem, students can collect information, discuss opinions or experiences with their peers, and propose
solutions (Hou, Chang, & Sung, 2008). Moreover, in a PBL environment, researchers have argued that learners can obtain several benefits in
terms of peers’ interaction, learning performance, communication skills, individual constructions of knowledge, and teamwork cooperative
skills, especially when the entire process is conducted by teams and not by individuals (Keating & Gabb, 2006).
Several strategies of PBL have been reported in literature (Hou et al., 2008; Kelson & Distlehorst, 2000; Schmidt & Moust, 2000). One of
the dominant strategies is group discussion. To facilitate group discussion, computer-mediated communication (CMC) has been used as
a tool in e-learning environments to extend learners’ interaction beyond time and space limitations. Generally, the CMC tools can be divided
into two types based on their communication method. The first is asynchronous communication, such as online discussion forums, and the
second is synchronous communication, such as instant messaging (IM) (Hou & Wu, 2011). Each of these two types of CMC tools has its

* Corresponding author. Tel.: +886 5 6315745; fax: +886 5 6364127.


E-mail addresses: yflan@nfu.edu.tw (Y.-F. Lan), oneesancon@yahoo.com.tw (P.-W. Tsai), shiyang@nfu.edu.tw (S.-H. Yang), hongjl@cc.ctu.edu.tw (C.-L. Hung).

0360-1315/$ – see front matter © 2012 Elsevier Ltd. All rights reserved.
doi:10.1016/j.compedu.2012.05.004
Y.-F. Lan et al. / Computers & Education 59 (2012) 1122–1135 1123

own characteristics and limitations in discussion applications (Branon & Essex, 2001; Hou & Wu, 2011; Johnson, 2006). Online asynchronous
discussion forums have been extensively applied in educational settings, and several studies have reported the benefits of such learning
tools (Cheng, Paré, Collimore, & Joordens, 2011; Hou et al., 2008; Shana, 2009; Vonderwell, 2003). For instance, by supporting extensive
interactions outside the classroom environment, asynchronous discussion forums can effectively enhance the process of acquiring, sharing,
and exchanging knowledge among students, and further improve learning outcomes and performance (Cheng, Paré, Collimore, & Joordens,
2011; Leidner & Jarvenpaa, 1995). Furthermore, using asynchronous communication tools allows students more time to respond to
discussion topics and reflect on their contribution (Hou & Wu, 2011; Sabau, 2005).
In recent years, many studies have investigated the effects of online asynchronous discussion on learners’ interaction and behavior.
Efforts to explore learners’ online discussion behavior include gathering quantitative data about levels of participation (Benbunan-Fich &
Hiltz, 1999; Cheng et al., 2011; Harasim, 1993) and content analysis (Lally & De Laat, 2003; Lee & Tsai, 2011; Louca, Druin, Hammer, &
Dreher, 2003; Stahl, 2002). On the other hand, researchers used different approaches to analyze and explain the implications of the
gathered information. For instance, researchers used the transcripts of online discussions to investigate and analyze the process of the social
construction of knowledge (Gunawardena, Carabajal, & Lowe, 2001; Gunawardena, Lowe, & Anderson, 1997), components of collaborative
learning (Curtis & Lawson, 2001; Johnson & Johnson, 1996), critical thinking (Newman, Webb, & Cochrane, 1995), and patterns of social
interaction between learners (Heo, Lim, & Kim, 2010). According to different theoretical viewpoints, researchers provided a comparison
between similar content analysis schemes (Wever, Schellens, Valcke, & Keer, 2006). In this study, the focus is on the social knowledge
construction behavioral patterns of problem-based asynchronous discussion.
To further understand a learner’s level of knowledge construction during problem-based online asynchronous discussion, a coding
scheme interaction analysis model (IAM) was proposed by Gunawardena et al. in 1997, and widely used to evaluate the level of knowledge
construction during online discussions (Heo et al., 2010; Hou & Wu, 2011; Hou et al., 2008; Jeong, 2003). However, Hou et al. (2008) argued
that these methods are unable to infer the overall sequential pattern of online discussion. Consequently, in 2011, they combined the IAM with sequential analysis to explore sequential relationships between each type of coded discussion content in the online
synchronous discussion, and inferred why the learner stopped the discussion or came to a conclusion hastily. As a result, the level of
knowledge construction behavioral patterns and the characteristics of students' problem-based online discussions can be evaluated, while also allowing statistical analysis of the sequences of all participants' discussion behaviors among group members within a certain time span (Hou & Wu, 2011).
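The combined IAM-plus-sequential-analysis approach can be illustrated with a small sketch (not the authors' implementation): count lag-1 transitions between consecutive coded behaviors, then convert each transition cell to an adjusted residual (z-score), where |z| > 1.96 marks a transition occurring significantly more or less often than chance. The code labels and the sequence below are hypothetical.

```python
import math
from collections import Counter

def lag1_z_scores(sequence):
    """Lag-1 sequential analysis: transition counts between consecutive
    coded behaviors, converted to adjusted residuals (z-scores)."""
    pairs = list(zip(sequence, sequence[1:]))   # consecutive (given, target) pairs
    n = len(pairs)
    counts = Counter(pairs)
    given = Counter(g for g, _ in pairs)        # frequency of each code as "given"
    target = Counter(t for _, t in pairs)       # frequency of each code as "target"
    z = {}
    for g in set(sequence):
        for t in set(sequence):
            expected = given[g] * target[t] / n  # expected count under independence
            denom = (math.sqrt(expected * (1 - given[g] / n) * (1 - target[t] / n))
                     if expected else 0.0)
            if denom:
                z[(g, t)] = (counts[(g, t)] - expected) / denom
    return z

# Hypothetical coded discussion sequence (IAM phase labels)
seq = ["KC1-1", "KC1-1", "KC1-2", "KC1-1", "KC1-3", "KC1-1", "KC1-1", "KC2-1"]
z = lag1_z_scores(seq)
```

Transitions with |z| above 1.96 would be drawn as significant arrows in a behavioral-pattern diagram; with a sequence this short, none reach significance.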
A learning environment is called m-learning if learners can use mobile devices to access learning materials and to support their
learning activities, anytime and anywhere. In recent years, using mobile devices for learning has become progressively more popular, and
mobile learning strategies have been discussed extensively (Chang et al., 2011; Chen & Huang, 2010; Cheng, Hwang, Wu, Shadiev, & Xie,
2010; Hwang & Chang, 2011; Sandberg, Maris, & Geus, 2011; Sung, Hou, Liu, & Chang, 2010). Compared to studies of asynchronous
discussion in e-learning, the research on social construction of knowledge and behavioral pattern analysis for m-learning is sparse. Even
though some researchers have examined learners’ participation in collaborative discussion with mobile devices (Vavoula, Sharples,
Rudman, Meek, & Lonsdale, 2009; Wong, Chin, Tan, & Liu, 2010), the issue of exploring the processes and behavioral patterns of asyn-
chronous discussion in an m-learning environment has attracted relatively little attention.
Therefore, research on behavioral patterns in asynchronous discussion activities using mobile devices is needed to further investigate the
effectiveness and limitations of all aspects of social knowledge construction in m-learning communities. Through empirical observations
and analysis of behavioral patterns and social knowledge construction in asynchronous discussions, more detailed processes of asyn-
chronous discussion activities, supported by mobile devices, can be better understood. Based on the above research motivation, the
purposes of this study are to develop a mobile system to support asynchronous discussion and combine content analysis and sequential
analysis, to compare and contrast the social knowledge construction behavioral patterns of problem-based asynchronous discussion in e-
learning and m-learning environments. Four research questions are described as follows:

1. Does online asynchronous discussion activity supported by mobile devices enable the students to obtain better learning performance?
2. During online asynchronous discussion activity, what are the characteristics of the students’ behavioral patterns, in the social
construction of knowledge in the control group (without mobile devices) and experimental group (with mobile devices)?
3. What are the different behavioral patterns in the social construction of knowledge between the control group and experimental group?
4. What are the students’ attitudes toward the online asynchronous discussion with the support of mobile devices in problem-based
learning?

2. Theoretical background

2.1. Social construction of knowledge in problem-based learning

In PBL, students with different levels of knowledge and experience work together in small groups toward a common goal which is
pertinent to their real contexts (Slavin, 1995). The students are not only able to collaboratively solve problems and reflect on their expe-
riences, but also able to co-construct knowledge through social interaction and peer assistance in collaborative PBL. Further, in the PBL
environment the instructors need to design appropriate problems related to the learning objective. According to the designed problems, the
students are encouraged to take responsibility for their group, and organize and direct the learning process to solve the problems (Boud &
Feletti, 1991; Savoie & Hughes, 1994; Trop & Sage, 2002).
PBL has several distinct characteristics, which may be identified and utilized in designing a curriculum. These are: (a) learning is driven
by challenging, open-ended, ill-defined and ill-structured problems, (b) students generally work in collaborative groups, (c) teachers work
as facilitators, coaches, helpers or consultants, and (d) students are only given guidelines for how to approach problems, and there is no one
formula for them to solve the problems (Arámbula-Greenfield, 1996; Kelson & Distlehorst, 2000; Schmidt & Moust, 2000; Trop & Sage,
2002). These characteristics challenge students to work collaboratively in groups to seek solutions to real-world problems. Teachers must allow
students to determine on their own what they need to know, and to learn through the study of various resources. Although PBL might be

implemented either in individual or in collaborative contexts, considering the notion of social knowledge construction, there is no doubt
that learners are likely to achieve better outcomes from PBL in collaborative contexts.

2.2. Problem-based learning in online asynchronous discussions

Online asynchronous discussions have been used in a wide range of formal educational settings. Previous research has also demonstrated
learning benefits for students in online asynchronous discussion activities. For example, researchers indicate that such learning activities
can promote learners’ higher-order thinking skills and problem-solving abilities (Hew & Cheung, 2008; Shapley, 2000), improve students’
interaction and cooperation (Hew & Cheung, 2003; Shana, 2009), and achieve higher levels of knowledge construction (Hou et al., 2008).
Compared to traditional face-to-face discussions, online asynchronous communication provides learners with more time and opportunities
to understand peers’ ideas, exchange personal experience, motivate thinking, create their own responses, and facilitate deeper reflections
(Ajayi, 2009; Hew & Cheung, 2008). In view of this, the combination of problem-based learning and online asynchronous discussion can
establish an environment of problem-based online discussion activity. As a requirement of the PBL activities, instructors define the problem
and guide students to think deeper, reflect further, and conduct more research.

2.3. Using mobile devices to support discussion activity

The rapid developments of wireless networking technology and mobile devices have given rise to the emergence of mobile learning
environments. Various mobile devices have been used in m-learning (Sharples & Beale, 2003). Researchers have found that using mobile
devices supports discussion activities and offers several benefits. These include the ability to: (a) promote information collection and
exchange, (b) improve communication and collaborative interaction, (c) encourage active learning, (d) enhance the learner’s feedback
process, and (e) acquire content quickly (Huang, Jeng, & Huang, 2009; Pownell & Bailey, 2000; Roschelle, 2003; Vavoula et al., 2009; Wong
et al., 2010; Yang & Lin, 2010; Zurita & Nussbaum, 2004). For example, Huang et al. (2009) presented a mobile blogging system that provides
the potential for collaborative interaction and learning opportunities for geographically dispersed persons and groups. Vavoula et al. (2009)
presented an evaluation of a Myartspace website, a service on mobile phones for inquiry-based learning, which allows learners to gather
information during school visits to museums. Yang and Lin (2010) used a tablet computer together with personal digital assistants in a discussion and learning activity designed to improve learners' ability to classify plants. Wong et al. (2010) presented a mobile-assisted language learning approach in which learners use smart phones to capture photos of real-life contexts pertaining to Chinese idioms, along with in-class or online sharing and discussion of those contexts. All the above examples
demonstrate that using mobile devices can support various types of discussion and learning activities.
In summary, each of the previously mentioned topics such as problem-based learning, online asynchronous discussion, and integrating
mobile devices into learning, supports the idea of engaging students in the process of knowledge construction and problem-based online
discussion strategies.

3. Methodology

This research combined both quantitative content analysis and sequential analysis, to compare the social knowledge construction
behavioral patterns of problem-based asynchronous discussion in e-learning and m-learning environments. The recorded content of
students’ problem-based online discussions was coded with the IAM. According to the coded data, content analysis and sequential analysis
were conducted to infer the behavioral pattern of online asynchronous discussion. Several researchers have suggested that the utilization of
such a combined method can provide a greater foundation of inference toward explaining students’ knowledge construction behavior
patterns (Heo et al., 2010; Hou et al., 2008; Hou, Sung, & Chang, 2009; Hou & Wu, 2011).

3.1. Participants

The participants of this experiment were 40 first year university students (20 males and 20 females) attending an “Introduction to
Computer Science” course in Taiwan. All participants were randomly assigned to two groups: the control group (problem-based online
asynchronous discussion activity without mobile devices support) and the experimental group (problem-based online asynchronous
discussion activity with mobile devices support), each of which contained 20 students. The students from each group were then divided into
several small discussion groups with each subgroup comprising five students, as shown in Table 1. The teacher for each of the two groups
was the same throughout the study.

Table 1
Distribution of the students for each group.

Group Number of students


Control group C1 5 males
C2 1 male and 4 females
C3 3 males and 2 females
C4 1 male and 4 females

Experimental group E1 5 males


E2 1 male and 4 females
E3 2 males and 3 females
E4 2 males and 3 females

Before the experiment, each of the students in the experimental group was assigned an HTC Touch CRUISE 09 smartphone, with built-in
Wi-Fi access, GPS receiver, and camera. Fifteen mobile devices were provided by the researchers and the rest were supplied by the participants themselves. All participants had prior training and were capable of using mobile devices.
In this study, the “Introduction to Computer Science” course focused on introducing software and hardware conceptions. The software
part of the course provided instruction in how to control computers and how to process information to meet individual or organizational needs. The hardware part of the course introduced the characteristics of each component and how to use these characteristics to
construct a computer.

3.2. Design of the learning activity

The learning activity was designed to investigate how mobile devices can assist problem-based learning in an online asynchronous discussion environment. To neutralize the Hawthorne effect, the study ensured that participants did not know what was being measured regarding their learning performance and behavior. All participants were told that the learning activities aimed to enhance their knowledge sharing and problem-solving skills. This focused their attention on learning goals, rather than on the tools they used to solve the assigned problems. At the beginning of the experiment, the instructor lectured the course in the usual way.
That is, the instructor introduced foundational concepts related to the learning topics, aiming to help students construct prior knowledge. To meet the PBL requirements, the instructor further designed appropriate problems (tasks) to encourage students to acquire new
knowledge by themselves through information processing and reasoning.
However, using only PBL to help learners construct knowledge may not be enough. For example, students may believe that something is right or useful despite the absence of proof or evidence. To make a correct judgment regarding a posted
discourse and engage in deeper understanding of a given problem, in this study the students were guided by the instructor to justify their
constructions of knowledge or understanding during the learning activities. More specifically, all participants were guided by the following
principles to complete the learning activities: (1) finding learning resources, (2) making logical inferences, (3) offering opinions with
reasons, (4) comparing and evaluating evidences, (5) asking relevant questions and seeking answers, (6) making criteria-based judgments,
(7) making evidence-based decisions, and (8) reflexivity. In addition, the instructor checked the students’ progress as well as learning status
and provided weekly feedback.
For the designed learning activity, each group had to complete four problem-based learning tasks according to the assigned problem
scenario (see examples in Appendix). Each task took one week and was designed to include reflecting learning topics, identifying problems,
collecting data, and exchanging ideas. A task was announced each week, with the students from each group asked to conduct collaborative
discussions on the task and exchange ideas until they reached a conclusion.
During the problem-solving process, all students could enter the forum, browse a list of all discussion topics, enter the page of a specific topic by clicking the title hyperlink, and then post their response messages, which included various answers, information, discussion, and conclusions. The students in the experimental group could additionally use mobile devices to support their discussion activity for m-learning. All the content captured or posted was recorded and sent by network communication from the mobile devices to the discussion forums for later analysis.

3.3. The proposed online asynchronous discussion system

To understand and measure the students’ online discussion behavior, an online asynchronous discussion system must be provided for
teaching implementation, and the system must also have a mechanism to record the entire discussion process. The proposed system, called the mobile interactive teaching feedback system (MITFS), works with both mobile devices (smartphones) and a website to support learners in online discussion. More specifically, the system provides two interaction interfaces, mobile and web, and associated
technologies as shown in Fig. 1.
For the mobile system, Windows Mobile 6.1 Professional was used as its operating system, and combined various features such as
TouchFLO, Wi-Fi, GPS mobile navigation, camera, video camera, voice recorder and media player. The web system used Microsoft Windows
Server 2003 as its operating system, and IIS 5.0 as its WWW (World Wide Web) server and Microsoft SQL Server 2005 as its database server.
The hardware specifications of the web system were CPU 3.4 GHz, 2 GB RAM and 200 GB HD space. The maximum number of simultaneous
users of the system could be more than one hundred, generally, depending on the Web Server specifications and network bandwidth. The
main system development tools included Microsoft Visual Studio 2008 and ASP.net 3.5. For synchronous information representation, data
was exchanged between the two systems through Web services. Learners were able to upload photos, search previous discourse, and browse all posted topics, as well as create new topics via mobile devices or PC (see Fig. 2 and Fig. 3). For example, MITFS allows users to log in to the
system for taking a picture and uploading it, as shown in Fig. 2(a,b). By clicking on a topic link, students can enter the topic page to view the
discussion messages and post responses.
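Because every message must later be coded and sequenced, each post (from either interface) needs to carry metadata such as author, subgroup, source device, and timestamp. A hypothetical minimal record, with field names that are illustrative rather than MITFS's actual schema, might look like:

```python
import json
from dataclasses import dataclass, asdict, field
from datetime import datetime

@dataclass
class DiscussionPost:
    """One forum message, logged with the metadata needed for later
    content analysis and sequential analysis (field names illustrative)."""
    post_id: int
    topic_id: int
    author: str
    group: str                  # e.g. "E1" (experimental) or "C2" (control)
    source: str                 # "mobile" or "web"
    posted_at: str              # ISO 8601 timestamp, used to order the sequence
    text: str
    attachments: list = field(default_factory=list)  # photo/audio file names

post = DiscussionPost(
    post_id=1, topic_id=7, author="s01", group="E1", source="mobile",
    posted_at=datetime(2011, 12, 5, 14, 30).isoformat(),
    text="Here is a photo of the motherboard component we discussed.",
    attachments=["photo_001.jpg"],
)
record = json.dumps(asdict(post))   # payload as it might travel over the web service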
The proposed system was designed around mobile technology providing five affordances: a capture tool, a representational tool, a multimedia-access tool, a connectivity tool, and an analytical tool (Churchill & Churchill, 2008). The environment mainly enables students to
share information and collaborate in discussion activities via the Internet/wireless network. The following five perceived affordances of
mobile technology were explicated in this study:

• Capture tool: Technology is equipped with capture capabilities that include the capture of photographs. The learners can observe real-world situations and use smart phones with cameras to photograph and transmit pictures via wireless networks, as shown in Fig. 2(c).
• Representational tool: Technology might be used by learners to create representations which demonstrate their experience, ideas and thinking. The learners can post and create relevant discussion topics to discuss with each other, as shown in Fig. 2(d).
• Multimedia-access tool: A variety of multimedia resources can be delivered using this technology. The learners can view the resources of discussion messages via photograph mode or audio mode, as shown in Fig. 2(e,f).
• Connectivity tool: Technology empowers learners to connect to each other in the field, and exchange ideas and files. The learners can share information and their opinions or thinking in discussion messages via e-mail, as well as the website, as shown in Fig. 2(g).

Fig. 1. The architecture of the MITFS system.

• Analytical tool: Technology might be used as an analytical tool to aid learners' tasks. The learners can give feedback and comments on each discussion topic through peer assessment, as shown in Fig. 2(h).

3.4. Procedure

The experimental procedure consisted of three stages, including the training phase, problem-based online asynchronous discussion
phase, and evaluation phase. After the experiment, this study analyzed the students’ learning performance, discussion content, and learning
satisfaction. Fig. 4 shows the procedure of the experiment.

Fig. 2. MITFS mobile interfaces: (a) login menu, (b) main menu, (c) taking a picture, (d) discussion, (e) browsing a picture, (f) listening to sound, (g) email, and (h) peer assessment.

Fig. 3. A webpage displaying the discussion topics in the forums.

In the first stage, the teacher introduced the learning activity, including how to carry out online discussion learning, and administered a pre-test to evaluate students' prior knowledge.
In the second stage, the students in the control group posted their discourse through the proposed web system to the learning platform. In contrast, apart from using the web system to submit discourse, the students in the experimental group also took photos, recorded audio, and input text through the mobile devices in the proposed system. According to the designed learning activities, each group was asked to research an important development in computer science and information technology. After finishing each week's learning activity, each group was required to prepare a PowerPoint presentation based on their discussion. The report, based on the problem-based learning tasks (see Appendix), presented the contents of the topic in the classroom and was used to measure their learning gains.
In the last stage, the students in the experimental group were required to complete a questionnaire, indicating their perception of using
the system.

3.5. Data collection and analysis

Data was collected from multiple sources, including PowerPoint presentation scores, the logs of online discussion forums, and students’
learning satisfaction questionnaires.

3.5.1. The students’ learning performance indicators for rating scales


This study used evaluation scales to measure the quality of the students' work, as depicted in Table 2. Two experts participated in the evaluation of the briefings. To prevent scoring bias, the two experts gave scores on the basis of the assessment criteria, without being told which group was the experimental group and which the control group. The results showed that the Pearson's product–moment

Fig. 4. Procedure of the experiment.

correlations of the two experts' scoring for the experimental group and the control group were 0.92 and 0.95, respectively (p < 0.01). The final item score was the average of the scores graded by the two experts.
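The Pearson product–moment correlation used here to check inter-rater consistency can be computed directly from the two experts' score lists; the scores below are hypothetical, since the individual ratings are not reported:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two raters' scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores by two experts for the same four group briefings
expert1 = [85, 78, 92, 70]
expert2 = [83, 80, 90, 72]
r = pearson_r(expert1, expert2)   # close to 1.0 means the raters agree
```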

3.5.2. The coding scheme for the content analysis in discussion activities
Regarding the social knowledge construction coding items of the coding scheme, the present study adopted Gunawardena et al.'s (1997) IAM coding items, which have already been used in previous research (Heo et al., 2010; Hou et al., 2008; Hou & Wu, 2011; Jeong,
2003; Marra, Moore, & Klimczak, 2004; Sing & Khine, 2006). Most importantly, Rourke and Anderson (2004) suggested that using a coding scheme proven by numerous researchers helps to increase the validity of the content analysis. In Table 3, each code represents
a discussion behavior. To ensure inter-rater consistency, all discussion content was then coded based on the knowledge construction coding
scheme by two experts.
The discussions among the 40 students over the four weeks yielded 763 codes. The inter-rater Kappa reliability value for the control group was 0.652 (p < 0.01), and the Kappa reliability value for the experimental group was 0.638 (p < 0.01). Both values were statistically significant, and the coded data was then put through knowledge construction content analysis and sequential analysis. The discussions then underwent quantitative content analysis of behaviors to allow us to understand and discuss the behavioral patterns.
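Cohen's Kappa, as used above, corrects the raw agreement between the two coders for the agreement expected by chance; a minimal sketch with hypothetical codings (not the study's data):

```python
from collections import Counter

def cohens_kappa(codes1, codes2):
    """Cohen's kappa for two raters who coded the same message units."""
    n = len(codes1)
    observed = sum(a == b for a, b in zip(codes1, codes2)) / n
    c1, c2 = Counter(codes1), Counter(codes2)
    # chance agreement: sum over categories of P(rater1 = k) * P(rater2 = k)
    expected = sum(c1[k] * c2[k] for k in set(codes1) | set(codes2)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical IAM codes assigned to ten messages by two raters
rater1 = ["KC1-1", "KC1-1", "KC1-2", "KC1-1", "KC1-3",
          "KC2-1", "KC1-1", "KC1-2", "KC1-1", "KC1-1"]
rater2 = ["KC1-1", "KC1-2", "KC1-2", "KC1-1", "KC1-3",
          "KC2-1", "KC1-1", "KC1-1", "KC1-1", "KC1-1"]
kappa = cohens_kappa(rater1, rater2)   # 0.8 raw agreement shrinks after chance correction
```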

3.5.3. Questionnaire items for students’ learning satisfaction


The effect of using mobile devices to support learning activities on students' learning satisfaction was evaluated by questionnaire. The responses to each question were designed using a 5-point Likert scale, in which 5 stands for "strongly agree" and 1 stands for "strongly disagree". The present study adapted the questionnaire from Yang and Lin (2010); the questionnaire items are shown in Table 4. SPSS statistical software was used to analyze the results of the questionnaire, with the reliability of the questionnaire (Cronbach's alpha) at 0.818. Analysis of the sample showed a reasonable level of reliability (alpha > 0.70).
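Cronbach's alpha estimates internal consistency from the item variances and the variance of respondents' total scores; a minimal sketch with hypothetical 5-point Likert responses (the study's raw responses are not given):

```python
def cronbach_alpha(items):
    """Cronbach's alpha. `items` is one list per questionnaire item,
    each holding the respondents' scores in the same order."""
    k = len(items)
    n = len(items[0])

    def var(xs):                        # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]  # per-respondent totals
    return k / (k - 1) * (1 - sum_item_vars / var(totals))

# Hypothetical responses: 3 items x 5 respondents (5-point Likert)
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 3, 4, 1],
]
alpha = cronbach_alpha(items)   # values above 0.70 are conventionally acceptable
```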

4. Results

4.1. The influence of different online asynchronous discussion strategies on students’ learning performance

As mentioned before, this study randomly assigned 20 students to a group that utilized online asynchronous discussion with mobile
devices, and the other 20 students to a group that emphasized online discussion without mobile devices. To examine if prior knowledge of

Table 2
Rating scales for briefing.

Learning performance indicators Description


The appearance of content (40%): To evaluate the content presentation in terms of richness, readability, relevance, and usefulness.
The skill of design and completion (30%): To evaluate the content design in terms of correctness and completeness.
The presentation of oral report (30%): To evaluate the oral report in terms of clarity and fluency.

Adopted from Chang et al. (2011): The study on integrating WebQuest with mobile learning for environmental education.

Table 3
The coding scheme for the content analysis of knowledge construction.

Dimension Code Phase Description


KC 1 (Academic-related) KC 1-1 Sharing and comparing of information Presenting new information to team members; a statement of
observation or opinion.
KC 1-2 The discovery and exploration of dissonance or inconsistency Identifying areas of disagreement; asking and answering
among ideas, concepts or statements questions to clarify disagreement.
KC 1-3 Negotiation of meaning or co-construction of knowledge Negotiating meanings of terms and negotiation of the relative
weight to be used for various agreements.
KC 1-4 Testing and modification of proposed synthesis or co-construction Testing the proposed new knowledge against existing cognitive
schema, personal experience or other source.
KC 1-5 Agreement statement / applications of newly-constructed meaning Summarizing agreements and meta-cognitive statements that
show new knowledge construction.
KC 2 (Off-topic) KC 2-1 Contents irrelevant to the learning task Content that is completely irrelevant to the learning
discussion task.

computer science was significantly different between the groups, a self-developed prior knowledge test (a multiple-choice measure of computer science with 40 questions, each with 4 choices) was administered, and an independent t-test was conducted. Comparing the achievement of the control group (M = 63.75, SD = 6.95) and the experimental group (M = 62.50, SD = 7.76), the results showed no significant difference between the test scores of the two groups (t = 0.494, p = 0.624); that is, the two groups of students had equivalent knowledge of computer science before participating in the learning activity.
After the experiment, the different discussion strategies had an impact on the students’ learning performance. Using an independent t-test, Table 5 presents the two groups’ performance indicators and the differences between them. From the PowerPoint presentation scores, it was found that the students in the experimental group had significantly better achievements than those in the control group (t = 5.37, p < 0.001). More specifically, in the appearance of content, the experimental group (M = 32.94, SD = 2.29) significantly outperformed the control group (M = 25.00, SD = 1.79) (t = 10.91, p < 0.001). However, for both the skill of design and completion and the presentation of oral report, the experimental group showed no significant difference from the control group on the performance indicators.
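The group comparisons in this section can be checked from the reported summary statistics alone. Below is a stdlib-only Python sketch of the pooled-variance independent-samples t statistic (with equal group sizes, the pooled and Welch denominators coincide); applied to the prior-knowledge means and SDs above, it yields t ≈ 0.54, close to the reported t = 0.494, the small gap plausibly reflecting rounding of the reported means and SDs:

```python
import math

def independent_t(mean1, sd1, n1, mean2, sd2, n2):
    """Pooled-variance independent-samples t statistic."""
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var * (1 / n1 + 1 / n2))

# Prior-knowledge test: control (M = 63.75, SD = 6.95, n = 20)
# vs. experimental (M = 62.50, SD = 7.76, n = 20)
t = independent_t(63.75, 6.95, 20, 62.50, 7.76, 20)
```

Since |t| is well below the two-tailed 5% critical value at df = 38 (about 2.024), the conclusion of no significant prior-knowledge difference holds either way.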

4.2. Analysis of knowledge construction in online discussions for two groups

As shown in Table 6, two assessors performed the coding according to the content of each message. When one discourse contained more than one code, the codes were arranged in order of time. For instance, if the first paragraph of a discourse was KC 1-1 and the second paragraph was KC 1-3, the discourse was coded as KC 1-1, KC 1-3. After the messages from all topics were coded in this way, each topic was given a set of knowledge construction codes.
Overall, the students in the control group not only shared information (KC 1-1), but also identified areas of disagreement and clarified goals and strategies (KC 1-2). They also conducted negotiations and achieved the co-construction of knowledge (KC 1-3). Interestingly, the students in the experimental group mostly shared information and achieved the co-construction of knowledge (KC 1-1 and KC 1-3). In addition, the control group produced 8% of “testing and modification of proposed synthesis or co-construction” (KC 1-4) contents, while the experimental group produced 10%; “agreement statement/applications of newly-constructed meaning” (KC 1-5) occupied approximately 10% of the entire discussion in each group (see Table 6).
More specifically, the results indicate that the experimental group shared more information and achieved more co-construction of meaning among group members, whereas the control group conducted fewer negotiations and achieved less co-construction.
The frequency and percentage of each code for the control group are shown in Fig. 5. From the distribution of the codes, this study found that “sharing and comparing of information” (KC 1-1) was the highest (34%), whereas “discovery and exploration of dissonance or inconsistency among participants” (KC 1-2) and “negotiation of meaning or co-construction of knowledge” (KC 1-3) were 20% and 22% of the entire discussion, respectively. It is notable that “testing and modification of proposed synthesis or co-construction” (KC 1-4) and “agreement statement/applications of newly-constructed meaning” (KC 1-5) only occupied approximately 8% and 10%, respectively.
In contrast, for the knowledge constructed by the experimental group via the website, the percentage of “sharing and comparing of information” (KC 1-1) was the highest (32%), whereas “negotiation of meaning or co-construction of knowledge” (KC 1-3) was 31%. It is notable that “testing and modification of proposed synthesis or co-construction” (KC 1-4) and “agreement statement/applications of newly-constructed meaning” (KC 1-5) occupied approximately 12% each, whereas “discovery and exploration of dissonance or inconsistency among participants” (KC 1-2) only occupied approximately 6% of the entire discussion, as shown in Fig. 6.
To understand the effects of using mobile devices on online discussion activities, the level of knowledge constructed by the experimental group via mobile devices is shown in Fig. 7. Most of the discussion content (80%) engaged in “sharing and comparing of information” (KC 1-

Table 4
Results of the questionnaire.

Questions Mean Point ≥ 4


1. I referred to information on the system for proof of my explanation to group members. 4.05 75%
2. I received the system messages quickly. 4.25 80%
3. Group discussions were facilitated with a common focus on the system. 4.20 80%
4. Group discussions were interesting through the use of the system. 4.35 90%
5. Controlling both the forum and the system was convenient. 4.10 80%
6. I am willing to use the system in other courses. 4.30 80%

Table 5
The T-test results of the performance indicators of different online discussion strategies.

Performance indicators Group Mean SD t-value (p-value)


PowerPoint presentation Experimental 74.63 4.70 5.37*** (0.000)
Control 67.06 3.11
The appearance of content Experimental 32.94 2.29 10.91*** (0.000)
Control 25.00 1.79
The skill of design and completion Experimental 22.19 2.51 0.39 (0.698)
Control 22.50 1.97
The presentation of oral report Experimental 19.50 3.01 0.07 (0.942)
Control 19.56 1.55
*p < 0.05, **p < 0.01, ***p < 0.001.

1), whereas “discovery and exploration of dissonance or inconsistency among participants” (KC 1-2) and “negotiation of meaning or co-
construction of knowledge” (KC 1-3), only occupied approximately 3% and 14% of the entire discussion. KC 1-4 and KC 1-5, the higher
dimensions of social knowledge construction were not found in this case.

4.3. Sequential analysis for online discussion between groups

According to the above quantitative content analysis, this study obtained an initial picture of the characteristics of knowledge construction in the asynchronous discussions of the two groups. The coded strings were then organized chronologically, and a sequential analysis was carried out on the strings.
Through the sequential analysis, this study summarized the frequency with which each behavioral type followed another, and was able to determine whether the sequence of one discussion behavior following another was significant. As shown in Table 7, each row represents an initial behavior and each column represents a follow-up behavior. The numbers in the table represent the total number of times a column behavior occurs immediately after a row behavior (for example, the number 18 in row 1, column 3 means that “KC 1-3 occurring immediately after KC 1-1” happened 18 times).
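The frequency-transition step described above can be sketched as follows; the coded strings shown are hypothetical stand-ins for the study’s coded topics, not its data:

```python
from collections import Counter

def count_transitions(sequences):
    """Tally how often each code immediately follows another,
    counting adjacent pairs within each topic's coded string."""
    transitions = Counter()
    for seq in sequences:
        for prev, curr in zip(seq, seq[1:]):  # adjacent pairs within one topic
            transitions[(prev, curr)] += 1
    return transitions

# Hypothetical coded strings for two discussion topics
topics = [
    ["KC 1-1", "KC 1-3", "KC 1-3", "KC 1-5"],
    ["KC 1-1", "KC 1-1", "KC 1-2", "KC 1-1"],
]
counts = count_transitions(topics)
```

Arranged as a matrix of initial (row) versus follow-up (column) behaviors, these tallies give a table of the same form as Table 7.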
To determine whether these sequential relationships reach statistical significance, this study conducted a sequential analysis of the data, listed in Table 8. When the z-score is greater than 1.96, the sequence of a given row and column is statistically significant (p < 0.05) (Bakeman & Gottman, 1997). After the analysis, the six significant sequences were compiled to form the behavioral transition diagram in Fig. 8.
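The z-scores in Table 8 are derived from the transition frequencies in Table 7. Below is a Python sketch of the adjusted-residual formulation commonly used for this purpose in lag-sequential analysis (Bakeman & Gottman, 1997); this is one standard form, and the study’s exact computation may differ in details such as how cross-topic transitions are handled:

```python
import math

def transition_z(freq):
    """Adjusted residuals for a square transition-frequency matrix:
    z_ij = (x_ij - e_ij) / sqrt(e_ij * (1 - row_i/N) * (1 - col_j/N)),
    where e_ij = row_i * col_j / N is the expected count."""
    k = len(freq)
    n = sum(map(sum, freq))
    row = [sum(r) for r in freq]
    col = [sum(freq[i][j] for i in range(k)) for j in range(k)]
    z = [[0.0] * k for _ in range(k)]
    for i in range(k):
        for j in range(k):
            e = row[i] * col[j] / n
            denom = math.sqrt(e * (1 - row[i] / n) * (1 - col[j] / n))
            z[i][j] = (freq[i][j] - e) / denom if denom > 0 else 0.0
    return z
```

A cell with z greater than 1.96 marks a follow-up behavior occurring significantly more often than chance expectation (p < 0.05).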
Fig. 8 presents all the sequences in Table 8 that reached a level of significance, with each arrow pointing in the direction of the transfer. The data shown in Table 7 and Fig. 8 provide the pattern of behaviors that occurred during the online discussions. From the data, the study derived the following patterns of behavioral transfer: KC 1-1 → KC 1-1, KC 1-1 → KC 1-3, KC 1-1 → KC 1-4, KC 1-1 → KC 1-5, KC 1-2 → KC 1-1, and KC 1-3 → KC 1-3, where each sequence reached significance during the online discussions.
For the experimental group, 425 codes were obtained after coding. The results of the frequency transition and sequential analysis are shown in Table 9 and Table 10, respectively. There were 8 significant sequences in the experimental group, and the study deduced a behavioral transfer diagram based on Table 10 (as shown in Fig. 9). The sequences that reached significance during the online discussions were KC 1-1 → KC 1-1, KC 1-1 → KC 1-3, KC 1-1 → KC 1-4, KC 1-2 → KC 1-1, KC 1-3 → KC 1-1, KC 1-3 → KC 1-3, KC 1-3 → KC 1-5, and KC 1-4 → KC 1-5.
In terms of knowledge construction, the results showed that the control group lacked KC 1-3 → KC 1-1, KC 1-3 → KC 1-5, and KC 1-4 → KC 1-5. However, the experimental group demonstrated the bidirectional knowledge construction pattern of KC 1-1 → KC 1-3 and KC 1-3 → KC 1-1. This finding indicated that the experimental group had more diverse knowledge construction in their discussions, and the significant differences help us understand the order of knowledge construction in the control and experimental groups.

4.4. Evaluation of students’ learning satisfaction for m-learning environment

Twenty valid copies of the questionnaire were collected after the experiment. Table 4 shows the results of the questionnaire with the mean scores for all 6 questions. The results show that students gave high scores in each category of the questionnaire.
The results also revealed that the system could help learners retrieve relevant information quickly. Students further hoped they could use it in other disciplines. With regard to collaboration supported by the system, most students agreed that the system could support information sharing and group discussion. Moreover, the findings on students’ learning attitudes and impressions of the learning activity showed that they were willing to use the system in other courses. These results demonstrate the system’s ability to promote communication, collaborative interaction, and discussion experiences.

5. Discussion

Several pedagogical implications can be drawn from this study. First, from the report content of the PowerPoint presentations and the online discussions of the two groups, this study found that the experimental group paid more attention to their surroundings associated with their daily

Table 6
The frequencies of coded discussion behaviors in control and experimental groups.

Group KC 1-1 KC 1-2 KC 1-3 KC 1-4 KC 1-5 KC 2-1 Total


Control 112 (34%) 68 (20%) 75 (22%) 28 (8%) 35 (10%) 20 (6%) 338
Experimental 175 (41%) 22 (5%) 118 (28%) 42 (10%) 40 (9%) 28 (7%) 425

[Fig. 5 pie chart data: KC 1-1, 112, 34%; KC 1-2, 68, 20%; KC 1-3, 75, 22%; KC 1-4, 28, 8%; KC 1-5, 35, 10%; KC 2-1, 20, 6%]
Fig. 5. Distribution of the codes for control group.

[Fig. 6 pie chart data: KC 1-1, 105, 32%; KC 1-2, 20, 6%; KC 1-3, 106, 31%; KC 1-4, 42, 12%; KC 1-5, 40, 12%; KC 2-1, 25, 7%]
Fig. 6. Distribution of the codes for experimental group from website.

[Fig. 7 pie chart data: KC 1-1, 70, 80%; KC 1-2, 2, 3%; KC 1-3, 12, 14%; KC 2-1, 3, 3%]
Fig. 7. Distribution of the codes for experimental group from mobile devices.

Table 7
Results of frequency transition in the control group.

KC 1-1 KC 1-2 KC 1-3 KC 1-4 KC 1-5 KC 2-1


KC 1-1 49 10 18 16 17 2
KC 1-2 22 11 11 7 8 5
KC 1-3 11 13 33 3 4 11
KC 1-4 9 12 3 1 2 1
KC 1-5 10 12 7 1 4 1
KC 2-1 7 10 3 0 0 0

Table 8
Results of sequential analysis for behaviors in the control group.

KC 1-1 KC 1-2 KC 1-3 KC 1-4 KC 1-5 KC 2-1


KC 1-1 13.23 0.24 2.90 2.24 2.57 2.42
KC 1-2 4.24 0.57 0.57 0.76 0.43 1.42
KC 1-3 0.57 1.24 7.90 2.09 1.76 0.57
KC 1-4 0.09 0.91 2.09 2.76 2.42 2.76
KC 1-5 0.24 0.91 0.76 2.76 1.76 2.76
KC 2-1 0.76 0.24 2.09 3.09 3.09 3.09

Bold values indicate z-scores greater than 1.96, i.e., sequences reaching significance (p < 0.05).

Fig. 8. The behavioral transition diagram of control group.

experiences (e.g., students frequently and effectively incorporated observable computer hardware into their feedback on the discussion content). In other words, when comparing the artifacts produced during the online asynchronous discussion, it was found that students in the experimental group created learning course materials that were more valuable in terms of richness, readability, relevance, and usefulness than those in the control group. However, both groups lacked the ability to express opinions and to report on their discussion process; therefore, teachers can help students foster diversified learning skills, such as oral reporting, in PBL activities.
Second, Table 6 demonstrates that during the four-week discussion period, the volume of codes in the experimental group (425 codes) was greater than that of the control group (338 codes), indicating that the students in the experimental group seemed to have more motivation to participate constantly in the discussions and discussed more frequently. The reasons may be that (1) students who can contribute conveniently via mobile devices have more time available for learning activities (they do not have to wait until they are beside their PC), and (2) using mobile devices outside the classroom for learning excited and interested many participants. Interestingly, regarding the level of knowledge constructed with the support of mobile devices, the students were more intent on collecting and sharing information to solve problems. The result implies that using mobile devices can help students acquire content quickly and further improve content richness, which is consistent with the argument that mobile technology provides users with two important features in m-learning, ubiquitous mobility and situated context (Jeng, Wu, Huang, Tan, & Yang, 2010). In this study, the mobile device was a key element in supporting information collection and knowledge construction, such as taking appropriate pictures in context to illustrate the discussion topic. Initially, students often attempted to collect more information and share it with peers to facilitate the entire process of discussion, rather than spending extra time transforming their perceptions into in-depth understanding of the ideas behind them (see Fig. 7).
Gradually, photo-taking became one of the most popular learning activities, focused on gathering resources. Further discussion behavior and knowledge construction processes took place when the students returned to a PC. From the discussion above, the findings infer that when combining mobile devices with a PC in online discussion processes, students were likely to select and use the appropriate technology to improve their learning performance accordingly. These findings support the claim that learners who can conveniently use mobile devices improve their learning (Wong et al., 2010).
Third, by observing the sequential patterns, this study found that the control group did not have thorough discussions (KC 1-1 → KC 1-4 and KC 1-1 → KC 1-5), meaning that the students mostly shared information and then moved directly to the “testing and modification of proposed synthesis” or “agreement statements”, with little negotiation of opinions among participants. In such situations, the depth of discussion is likely to be insufficient. In contrast, the students of the experimental group exchanged more messages (KC 1-1 → KC 1-3 and KC 1-3 → KC 1-1), indicating that they had more opportunities for interaction. Other results showed that these learners tended to focus their efforts on co-constructing new knowledge (KC 1-3 → KC 1-5 and KC 1-4 → KC 1-5). This may reflect how the students focused more on further exploration of knowledge during the process of problem solving. Moreover, more information sharing and comparison were found in the

Table 9
Results of frequency transition in the experimental group.

KC 1-1 KC 1-2 KC 1-3 KC 1-4 KC 1-5 KC 2-1


KC 1-1 99 15 28 20 0 9
KC 1-2 19 0 3 0 0 0
KC 1-3 24 1 53 14 21 5
KC 1-4 3 0 12 2 19 5
KC 1-5 15 5 11 4 0 5
KC 2-1 10 1 11 2 0 4

Table 10
Results of sequential analysis for behaviors in the experimental group.

KC 1-1 KC 1-2 KC 1-3 KC 1-4 KC 1-5 KC 2-1


KC 1-1 25.93 0.99 4.85 2.47 3.46 0.79
KC 1-2 2.18 3.46 2.57 3.46 3.46 3.46
KC 1-3 3.66 3.17 12.27 0.69 2.77 1.98
KC 1-4 2.57 3.46 0.10 2.87 2.18 1.98
KC 1-5 0.99 1.98 0.20 2.28 3.46 1.98
KC 2-1 0.49 3.17 0.20 2.87 3.46 2.28

Bold values indicate z-scores greater than 1.96, i.e., sequences reaching significance (p < 0.05).

Fig. 9. The behavioral transition diagram of experimental group.

experimental group, indicating that students used mobile devices to support the online asynchronous discussion and were engaged in exploring knowledge. Prior research on mobile learning has shown that the mobility and connectivity of the devices enable students to become active participants, not passive receivers, in learning activities (Looi et al., 2010; Vavoula et al., 2009; Wong et al., 2010; Yang & Lin, 2010). When learners become contributors (e.g., learners actively discover learning content for themselves, rather than relying on passive guidance for knowledge acquisition and sharing), they demonstrate informed participation to explore large problem spaces, learn from their peers, and create new understandings, a finding supported by prior research (Vavoula et al., 2009; Wong et al., 2010; Yang & Lin, 2010).
In sum, this research provides evidence that mobile devices can facilitate and assist learners’ social knowledge construction process. Based on problem-based learning, learners can construct their knowledge through different learning activities, such as raising questions, collecting and sharing information, discussing with peers, and discovering the solution to a problem. Apart from supporting the online asynchronous interactions, the mobile device offers a new discussion strategy. For instance, learners can carry out learning activities in the outside world and acquire context-aware learning materials to enhance their learning experience. Most importantly, mobile devices deliver information to learners effectively during their learning activities. As a result, the functions of mobile devices provide opportunities for learners to enhance interaction among group members, which is one of the key elements of social knowledge construction (Gunawardena et al., 1997; Vygotsky, 1978).

6. Conclusions

Although Hou et al. (2008) examined learners’ levels of knowledge construction in problem-solving-based online asynchronous discussion, there has been little work combining sequential analysis with the level of social knowledge construction. The major contribution of this study is the introduction of mobile devices to support online asynchronous discussion, and a comparison with traditional online asynchronous discussion in terms of social knowledge construction behavioral patterns. This research coded the problem-based online asynchronous discussion content and conducted a quantitative content analysis and a sequential analysis of behavioral patterns. After the analysis of behavioral patterns, this study compared the characteristics and behavioral differences between the experimental and control groups using different asynchronous discussion strategies. This study suggests that using a mobile device to support online asynchronous discussion produces a considerable improvement in student performance.
Following the analysis of the frequency and percentage of knowledge construction codes, this study found that mobile devices are beneficial to discussion tasks and social knowledge construction. Students in the experimental group had more academic-related discussions, especially regarding the “sharing and comparing of information” and the “co-construction of knowledge”. In the sequential analysis of problem-based discussion behavior, this study found that the behavioral sequences of the experimental group achieved a more diverse social knowledge construction process. This also indicates that using mobile devices to support online discussion activity is more helpful to students in the process of summarizing agreement statements. Furthermore, compared with the results of the control group, these differences indicate a correlation between discussion quality and knowledge construction. This study may offer teachers some ideas about specific ways to reduce the limitations of mobile devices in supporting discussions. These results have important implications for pedagogy.
Finally, the rapid development of wireless networking and mobile technologies offers educators new opportunities to shape discussion activities. Compared to research on online asynchronous discussions, behavioral analysis of the use of mobile devices to support discussions is very limited. However, this research is limited by its focus on problem-based learning; the exploration focused only on the patterns of problem-based online asynchronous discussion behavior. Because there is no further comparison and analysis of

problem statement types (i.e., well-structured versus ill-structured problems), further research can be carried out on the behavioral patterns and differences between problem statement types and knowledge types when mobile devices are used to support online discussion activity.

Appendix. Two examples of the designed learning activity

Topic 1: shopping for hardware


• Step 1: Your friend has told you that she wishes to purchase a personal computer to help keep track of her purchases, sales, customers, and finances. Your friend knows that you are taking a course in information systems and wants your advice about which computer she should buy. Before offering your recommendation, you feel that you need to learn more about the available hardware. You plan to give her a brief review of three possible computers she could buy – two desktop computers and one laptop.
• Step 2: In small groups, select three computers to review. Then collect the following information on each one: 1) input, storage, processing, and output specifications; 2) relative speed; 3) features and options; 4) price; and 5) ease of use. Secure information from vendors of the product, literature about the product, articles in computer magazines and journals, interviews with users, and your own tests of the computer.
• Step 3: Write the reviews. Include the specifications, advantages, disadvantages, and potential uses of each computer. Prepare sufficient copies to share with other members.
• Step 4: In small groups, answer the following questions:
1. What are the basic features of the computers you reviewed?
2. How do the computers differ?
3. What tradeoffs does a computer buyer make when selecting a computer?
4. What criteria should a buyer use when purchasing a computer?

Topic 2: evaluating computer hardware


• Step 1: Select a department in your college or university or in an organization of your choice.
• Step 2: In small groups, design a checklist or questionnaire for describing the computer hardware technology used by the department you selected. The checklist or questionnaire should allow you to specify the types and characteristics of the input, storage, processing, and output devices the department uses.
• Step 3: Visit the department and administer the checklist or questionnaire to obtain an inventory and description of the hardware used in that department. Briefly interview one or two department members about their experiences using the equipment.
• Step 4: Share your results. Compare and contrast the types of equipment available for various users.
1. How well do you think the equipment meets the department’s needs?
2. What changes in hardware might improve information processing in the department?

References

Ajayi, L. (2009). An exploration of pre-service teachers’ perceptions of learning to teach while using asynchronous discussion board. Educational Technology & Society, 12(2),
86–100.
Arámbula-Greenfield, T. (1996). Implementing problem-based learning in a college science class: testing problem-solving methodology as a viable alternative to traditional
science-teaching techniques. Journal of College Science Teaching, 26(1), 26–30.
Bakeman, R., & Gottman, J. M. (1997). Observing interaction: An introduction to sequential analysis (2nd ed.). Cambridge, UK: Cambridge University Press.
Benbunan-Fich, R., & Hiltz, S. R. (1999). Impacts of asynchronous learning networks on individual and group problem solving: a field experiment. Group Decision and
Negotiation, 8(5), 409–426.
Boud, D., & Feletti, G. (1991). The challenge of problem based learning. New York: St. Martin’s Press.
Branon, R. F., & Essex, C. (2001). Synchronous and asynchronous communication tools in distance education: a survey of instructors. TechTrends, 45, 36–42.
Chang, C. S., Chen, T. S., & Hsu, W. H. (2011). The study on integrating WebQuest with mobile learning for environmental education. Computers & Education, 57(1), 1228–1239.
Chen, H. R., & Huang, H. L. (2010). User Acceptance of mobile knowledge management learning system: design and analysis. Educational Technology & Society, 13(3), 70–77.
Cheng, S. C., Hwang, W. Y., Wu, S. Y., Shadiev, R., & Xie, C. H. (2010). A mobile device and online system with contextual familiarity and its effects on English learning on
campus. Educational Technology & Society, 13(3), 93–109.
Cheng, C. K., Paré, D. E., Collimore, L. M., & Joordens, S. (2011). Assessing the effectiveness of a voluntary online discussion forum on improving students’ course performance.
Computers & Education, 56(1), 253–261.
Churchill, D., & Churchill, N. (2008). Educational affordances of PDAs: a study of a teacher’s exploration of this technology. Computers & Education, 50(4), 1439–1450.
Curtis, D. D., & Lawson, M. J. (2001). Journal of Asynchronous Learning Networks, 5(1), 21–34.
Gunawardena, C. N., Carabajal, K., & Lowe, C. A. (2001). Critical analysis of models and methods used to evaluate online learning networks. In American Educational Research
Association annual meeting. Seattle: American Educational Research Association.
Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of global online debate and the development of an interaction analysis model for examining social
construction of knowledge in computer conferencing. Journal of Educational Computing Research, 17(4), 397–431.
Harasim, L. M. (1993). Collaborating in cyberspace: using computer conferences as a group learning environment. Interactive Learning Environments, 3(2), 119–130.
Heo, H., Lim, K. Y., & Kim, Y. (2010). Exploratory study on the patterns of online interaction and knowledge co-construction in project-based learning. Computers & Education,
55(3), 1383–1392.
Hew, K. F., & Cheung, W. S. (2003). Evaluating the participation and quality of thinking of pre-service teachers in an asynchronous online discussion environment: part 1.
International Journal of Instructional Media, 30(3), 247–262.
Hew, K. F., & Cheung, W. S. (2008). Attracting student participation in asynchronous online discussions: a case study of peer facilitation. Computers & Education, 51(3), 1111–
1124.
Hou, H. T., Chang, K. E., & Sung, Y. T. (2008). Analysis of problem-solving based online asynchronous discussion pattern. Educational Technology & Society, 11(1), 17–28.
Hou, H. T., Sung, Y. T., & Chang, K. E. (2009). Exploring the behavioral patterns of an online knowledge sharing discussion activity among teachers with problem-solving
strategy. Teaching and Teacher Education, 25(1), 101–108.
Hou, H. T., & Wu, S. Y. (2011). Analyzing the social knowledge construction behavioral patterns of an online synchronous collaborative discussion instructional activity using
an instant messaging tool: a case study. Computers & Education, 57(2), 1459–1468.
Huang, Y. M., Jeng, Y. L., & Huang, T. C. (2009). An educational mobile blogging system for supporting collaborative learning. Educational Technology & Society, 12(2), 163–175.
Hwang, G. J., & Chang, H. F. (2011). A formative assessment-based mobile learning approach to improving the learning attitudes and achievements of students. Computers &
Education, 56(1), 1023–1031.
Jeng, Y. L., Wu, T. T., Huang, Y. M., Tan, Q., & Yang, S. J. H. (2010). The add-on impact of mobile applications in learning strategies: a review study. Educational Technology &
Society, 13(3), 3–11.

Jeong, A. C. (2003). The sequential analysis of group interaction and critical thinking in online threaded discussions. The American Journal of Distance Education, 17(1), 25–43.
Johnson, G. M. (2006). Synchronous and asynchronous text-based CMC in educational contexts: a review of recent research. TechTrends, 50(4), 46–53.
Johnson, D. W., & Johnson, R. T. (1996). Cooperation and the use of technology. In D. H. Jonassen (Ed.), Handbook of research for educational communications and technology (pp.
1017–1044). New York: Simon and Schuster Macmillan.
Keating, S., & Gabb, R. (2006). PBL in engineering: Student expectations in 2006. Postcompulsory Education Centre, Victoria University. Retrieved 15.11.11, from. http://tls.vu.edu.
au/portal/site/research/resources/PBL_student_expectations_2006.pdf.
Kelson, A., & Distlehorst, L. (2000). Groups in problem-based learning: essential elements in theory and practice. In D. Evenson, & C. Hmelo (Eds.), Problem-based learning: A
research perspective on learning interactions (pp. 167–184). London: Lawrence Erlbaum Associates.
Lally, V., & De Laat, M. (2003). A quartet in E. In B. Wasson, S. Ludvigsen, & U. Hoppe (Eds.), Designing for change in networked learning environments (pp. 47–56). Dordrecht:
Kluwer Academic Publishers.
Lee, S. W. Y., & Tsai, C. C. (2011). Identifying patterns of collaborative knowledge exploration in online asynchronous discussions. Instructional Science, 39(3), 321–347.
Leidner, D. E., & Jarvenpaa, S. L. (1995). The use of information technology to enhance management school education – a theoretical view. MIS Quarterly, 19(3), 265–291.
Looi, C. K., Seow, P., Zhang, B., So, H. J., Chen, W., & Wong, L. H. (2010). Leveraging mobile technology for sustainable seamless learning: a research agenda. British Journal of
Educational Technology, 42(1), 154–169.
Louca, L., Druin, A., Hammer, D., & Dreher, D. (2003). Students’ collaborative use of computer-based programming tools in science: a descriptive study. In B. Wasson,
S. Ludvigsen, & U. Hoppe (Eds.), Designing for change in networked learning environments (pp. 109–118). Dordrecht: Kluwer Academic Publishers.
Marra, R. M., Moore, J. L., & Klimczak, A. K. (2004). Content analysis of online discussion forums: a comparative analysis of protocols. Educational Technology, Research and
Development, 52(2), 23–40.
Newman, D. R., Webb, B., & Cochrane, C. (1995). A content analysis method to measure critical thinking in face-to-face and computer supported group learning. Interpersonal
Computing and Technology, 3(2), 56–77.
Pownell, D., & Bailey, G. D. (2000). The next small thing: handheld computing for educational leaders. Learning and Leading with Technology, 27(8), 46–49.
Roschelle, J. (2003). Keynote paper: unlocking the learning value of wireless mobile devices. Journal of Computer Assisted Learning, 19(3), 260–272.
Rourke, L., & Anderson, T. (2004). Validity in quantitative content analysis. Educational Technology, Research and Development, 52(1), 5–18.
Sabau, I. (2005). Effective asynchronous communication online. Retrieved 15.11.11, from http://breeze.ucalgary.ca/p52308523.
Sandberg, J., Maris, M., & de Geus, K. (2011). Mobile English learning: an evidence-based study with fifth graders. Computers & Education, 57(1), 1334–1347.
Savoie, J. M., & Hughes, A. S. (1994). Problem-based learning as classroom solution. Educational Leadership, 52(3), 54–57.
Schmidt, H. G., & Moust, J. H. (2000). Factors affecting small-group tutorial learning: a review of research. In D. H. Evenson, & C. E. Hmelo (Eds.), Problem-based learning: A
research perspective on learning interactions (pp. 19–52). Mahwah, NJ: Lawrence Erlbaum.
Shana, Z. (2009). Learning with technology: using discussion forums to augment a traditional-style class. Educational Technology & Society, 12(3), 214–228.
Shapley, P. (2000). Online education to develop complex reasoning skills in organic chemistry. Journal of Asynchronous Learning Networks, 4(2), 55–65.
Sharples, M., & Beale, R. (2003). A technical review of mobile computational devices. Journal of Computer Assisted Learning, 19(3), 392–395.
Sing, C. C., & Khine, M. S. (2006). An analysis of interaction and participation patterns in online community. Educational Technology & Society, 9(1), 250–261.
Slavin, R. E. (1995). Cooperative learning: Theory, research, and practice (2nd ed.). Massachusetts: Allyn & Bacon.
Stahl, G. (2002). Contributions to a theoretical framework for CSCL. In G. Stahl (Ed.), Computer support for collaborative learning: Foundations for a CSCL community (pp. 62–71).
Hillsdale, NJ: Lawrence Erlbaum Associates.
Sung, Y. T., Hou, H. T., Liu, C. K., & Chang, K. E. (2010). Mobile guide system using problem-solving strategy for museum learning: a sequential learning behavioral pattern
analysis. Journal of Computer Assisted Learning, 26(2), 106–115.
Trop, L., & Sage, S. (2002). Problems as possibilities: Problem-based learning for K-16 education (2nd ed.). Alexandria, VA: Association for Supervision and Curriculum
Development.
Vavoula, G., Sharples, M., Rudman, P., Meek, J., & Lonsdale, P. (2009). Myartspace: design and evaluation of support for learning with multimedia phones between classrooms
and museums. Computers & Education, 53(2), 286–299.
Vonderwell, S. (2003). An examination of asynchronous communication experiences and perspectives of students in an online course: a case study. Internet and Higher
Education, 6(1), 77–90.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Wever, B. D., Schellens, T., Valcke, M., & Keer, H. V. (2006). Content analysis schemes to analyze transcripts of online asynchronous discussion groups: a review. Computers &
Education, 46(1), 6–28.
Wong, L. H., Chin, C. K., Tan, C. L., & Liu, M. (2010). Students’ personal and social meaning making in a Chinese idiom mobile learning environment. Educational Technology &
Society, 13(4), 15–26.
Yang, J. C., & Lin, Y. L. (2010). Development and evaluation of an interactive mobile learning environment with shared display groupware. Educational Technology & Society,
13(1), 195–207.
Zurita, G., & Nussbaum, M. (2004). Computer supported collaborative learning using wirelessly interconnected handheld computers. Computers & Education, 42(3), 289–314.