Comparing Student Activity and Performance in the Classroom and
a Virtual Learning Environment
Christian Stöhr, Christophe Demazière and Tom Adawi
Chalmers University of Technology, Sweden
christian.stohr@chalmers.se
demaz@chalmers.se
tom.adawi@chalmers.se
Abstract: In recent years, we have witnessed an increasing use of e-learning in higher education, triggered by both new
educational technologies and new pedagogical approaches. This development raises questions about how students learn in
virtual learning environments (VLEs) compared to traditional classroom environments. While several case studies have
examined this question, they are often based on single course iterations and there is a lack of longitudinal and quasi-
experimental comparative studies. In this paper, we examine how student activity and performance are related in a graduate
course in applied physics that was reformed by replacing the traditional classroom environment with a VLE. We use
longitudinal data from six iterations of the course, of which half were campus based and half were conducted online. We
analyse quantitative data based on home assignments, the students’ participation in course activities as well as the quantity
and quality of questions that students posed during the course. The results show that there is no statistically significant
difference in the students’ average performance across the two formats. However, for the VLE there is a substantially greater
variation in individual performance. Moreover, the participation in synchronous activities in the VLE, such as online wrap-up
sessions and tutorials, is strongly correlated with improved student performance. Further, students who asked content-related questions are more likely to achieve better outcomes. We conclude that despite the reported benefits of video
lectures, even when augmented with built-in quizzes, these are not sufficient for encouraging a deep approach to learning.
Our results provide further evidence that video lectures need to be accompanied by other learning activities that require
students to engage in higher-order thinking skills. We discuss our results in the light of related empirical and theoretical
work, and highlight implications for teaching in blended and virtual learning environments.
Keywords: virtual learning environments, learning analytics, student performance, student activity, higher education
1. Introduction
The rapid development and availability of information technology and the internet have triggered technology-driven change in higher education. Universities, as traditional institutions of higher education, are increasingly
enhancing traditional classroom experiences with open, web-based and blended forms of teaching and learning
by, for example, incorporating Learning Management Systems and Virtual Learning Environments (VLEs; e.g.
Britain and Liber, 2004) in campus education or by engaging with Massive Open Online Courses (MOOCs; e.g.
Daniel, 2012). Some teachers attempt to combine the benefits of web-based learning and in-class meetings in
more blended forms, for example by applying a flipped classroom model (Bishop and Verleger, 2013).
For more than two decades, scholars have tried to examine how effective web-based courses are compared to
traditional classroom education in higher education. Research on technology-enhanced learning in the 1990s mainly focused on video and audio technology. This strand of research did not find a significant difference in student performance between traditional face-to-face instruction and technology-supported environments (see Russell, 1999, for an extensive review). Thus, one can argue that it is not the technology itself, but
the instructional model that determines learning outcomes (Piccoli et al, 2001).
In contrast to early uses of technology in education, modern technologies, in particular web-based technologies provided by VLEs, have the potential to significantly alter the educational environment. VLEs provide a number
of advantages in terms of increased convenience and flexibility. One of the key arguments for using VLEs is the
increased amount of learner control – the students need to make their own decisions about pace, flow or order
of instruction (Williams, 1996). Laurillard (2002) has classified different media forms of learning technologies
that VLEs can support:
• Narrative media for attending and apprehending (e.g. texts, videos, audio, images)
• Interactive media for investigating and exploring (e.g. libraries, search engines, web links)
• Communicative media for discussing and debating (e.g. discussion forums, chats, email)
• Adaptive media for experimenting and practicing (e.g. online simulations, virtual labs, online quizzes)
• Productive media for expressing and presenting (e.g. blogs, wikis, word processors, spreadsheets)
Most VLEs, however, appear to primarily focus on the first and second (narrative and interactive) media
(Lameras et al, 2012). Research on web-based training and VLEs gained momentum around 15 years ago. In one of the first studies, Piccoli et al (2001) did not find any significant difference in student performance between VLEs and traditional teaching. They did, however, report increased self-efficacy but lower levels of student satisfaction in VLEs. Similarly, de Jong et al (2013) and Li et al (2014) reported indistinguishable performance between classroom-based teaching and online teaching. Li et al (2014), however, pointed out that
e-learning enhanced higher-level learning. On the other hand, Chou and Liu (2005), for example, found that
students in VLEs show better performance, higher levels of self-efficacy, satisfaction and learning. Al-Qahtani
and Higgins (2013), comparing classroom teaching, online learning and blended learning in an experimental
design, concluded that the latter outperformed the two others in terms of student performance, but observed
no difference between the pure classroom and online format. In a recent review of research on the flipped
classroom model, O'Flaherty and Phillips (2015) indicated that there is a lot of indirect evidence of improved
student performance and satisfaction for this blended learning model. Thus, blended forms of learning appear
to be preferred over either one of the pure online or classroom teaching environments, a result also confirmed by Tayebinik and Puteh (2012). Nevertheless, O'Flaherty and Phillips (2015) also stressed that there
is a lack of conclusive evidence that blended learning contributes to building lifelong learning and other 21st
century skills. Finally, McCutcheon et al (2015) pointed to the wide variation in the types of online and blended
learning approaches. They nevertheless concluded that online learning is as effective as traditional teaching and
were not able to provide any clear conclusions for blended learning approaches.
In sum, while there are no definite findings on whether or not traditional teaching outperforms online learning
or vice versa, there are indications that approaches that are sensitive to the particularities of both the campus
and online learning environments result in better student performance. Thus, the rapid technological
development needs to be accompanied by pedagogical approaches that fit technology-based learning
environments and further research is needed to explore conditions that optimize student learning in VLEs.
In this article, we contribute to this discussion by examining how the teaching environment, student activity and
student performance are related in a graduate course in applied physics. The course (taught by the second
author) was first delivered in a traditional classroom environment, but was later revised and delivered in a VLE.
Thus, we have longitudinal data over six course iterations. The use of educational technology also made it
possible to collect new kinds of evaluation data, enabling us not only to analyse student performance in tests
but also to examine their activity in different, ungraded elements of the course. More specifically, two research
questions are addressed in this article:
• How does the reform from a traditional classroom environment to a VLE affect student performance?
• How do student activity and feedback affect student performance in the VLE?
2. Methods
In this study, we examine the relation between learning environment, student activity and performance using
quantitative data based on home assignments, the students’ participation in course activities as well as the
quantity and quality of questions that the students posed during the course. In this section, we describe the
pedagogical approach and set-up of the course during the different iterations. We start with a description of the
course, followed by information about the data collection and analysis process.
their own. In order to pass the course, students had to attend at least 75% of the webcasts and complete all summaries (which were nevertheless ungraded) as well as the home assignments.
We use longitudinal data from yearly iterations of the course over six years, starting in 2009, and during which
the course was reformed from a campus-based format to a pure web format. During the first two years, the
course was delivered in a traditional campus-based format. In the third year, the course was also offered in a
campus-based format, but the in-class sessions were recorded and made available to the students after class.
For the three following consecutive years, the course was entirely delivered in a VLE. In year four, the course
was offered in a web-based format with asynchronous parts (webcasts) and synchronous parts (tutorials). During
the last two years, the web-based format was enhanced by including online quizzes and synchronous wrap-up
sessions, and the final version of the course consisted of the following elements:
• Pre-recorded lectures (webcasts) that were available to students for on-demand viewing;
• Tutorials that were live-broadcasted, with synchronous interaction between the students and the teachers;
• Online quizzes embedded in the webcasts, focusing on conceptual understanding;
• Regular synchronous wrap-up sessions designed to address the students' needs, based on input from the students; and
• A discussion forum.
The wrap-up sessions were part of a Just-in-Time Teaching approach (JiTT; Watkins and Mazur, 2010). In addition, students were provided with feedback prompts to pose questions to the teachers while watching the lectures. They also had the opportunity to rate the lectures and provide more specific feedback on them. The discussion fora were created to enable students to interact with each other, and provided valuable information for formative feedback and necessary interventions. In practice, however, the fora were used exclusively for the home assignments.
The course is hosted on a VLE (or learning management platform) called Ping Pong, developed by Ping Pong AB,
Sweden. This VLE is used as an entry portal to all electronic resources in the course. In addition, Ping Pong is
utilized for managing the online quizzes, the rating of the webcasts, the feedback on the webcasts, and the
discussion fora. The webcasts are recorded and broadcasted (for on-demand viewing exclusively) using
Mediasite, developed by Sonic Foundry Inc., USA. Finally, the wrap-up and tutorials are live-streamed, as well as
recorded for later on-demand viewing, using Adobe Connect, developed by Adobe Systems Inc., USA.
3. Results
Figure 1: Average student scores for the four home assignments over six course iterations
The analysis of student performance can be enhanced by looking not only at the average scores on the home assignments, but also at the variation between students. The standard deviation for the four assignments is presented in Figure 2. For the traditional format, the standard deviation is generally markedly lower than for the VLE. So while in the traditional format the results of the home assignments were fairly homogeneous among the students, we see a much larger variation in student performance for the online format. For the course iterations 2012/2013 and 2014/2015, the standard deviation was comparatively high for all four assignments. The year 2013/2014 appears somewhat exceptional, with high variation in home assignments #1 and #2 but a comparatively low standard deviation for assignments #3 and #4.
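The descriptive analysis behind Figures 1 and 2 amounts to computing per-iteration means and standard deviations of assignment scores. The following is not the authors' code, just a minimal sketch assuming a flat table of per-student scores; the column names and values are invented for illustration:

```python
# Sketch: mean and standard deviation of assignment scores per course
# iteration (cf. Figures 1 and 2). Data below is made up for illustration.
import pandas as pd

scores = pd.DataFrame({
    "iteration": ["2009/10", "2009/10", "2013/14", "2013/14"],
    "assignment": [1, 1, 1, 1],
    "score": [8.0, 7.5, 9.0, 4.0],
})

# One row per (iteration, assignment) pair, with the cohort mean and
# (sample) standard deviation of the scores.
summary = scores.groupby(["iteration", "assignment"])["score"].agg(["mean", "std"])
print(summary)
```

With real per-student data, the larger spread of the online cohorts would show up directly in the `std` column.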
Figure 2: Standard deviations for the four home assignments over six course iterations
Finally, we categorized the content-related questions according to their cognitive level using the revised version of Bloom’s taxonomy (Krathwohl, 2002). Due to the limited number of questions, this analysis could only be meaningfully done for 2013/2014. From Figure 5, it can be seen that, with the exception of “creating”, the students posed questions covering all cognitive levels of the taxonomy. Most of the questions posed by the students belonged to the “understanding” level, followed by the “analyzing” and “evaluating” levels. However, it is worth noting that more than half of the questions were at a higher cognitive level (above “remembering” and “understanding”). The two content-related questions in the last course iteration were categorized as “evaluating” and “creating”.
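Once the questions have been manually coded, the tally summarized in Figure 5 is a simple frequency count. As a sketch only (the codes and counts below are invented and do not reproduce the study's data):

```python
# Sketch: tallying manually coded questions by Bloom level and computing
# the share at higher cognitive levels. Codes below are illustrative.
from collections import Counter

codes = ["understanding", "understanding", "analyzing",
         "evaluating", "remembering", "applying"]
counts = Counter(codes)

# Levels above "remembering" and "understanding" in the revised taxonomy.
higher_order = {"applying", "analyzing", "evaluating", "creating"}
share_higher = sum(n for level, n in counts.items()
                   if level in higher_order) / len(codes)
print(counts)
print(f"Share at higher cognitive levels: {share_higher:.0%}")
```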
be explained by the fact that an online course, with its increased flexibility, requires a higher degree of self-
regulation. In other words, VLEs place higher demands on students for taking responsibility and control over
their learning. It is therefore important to consider how to provide struggling students with the necessary
support and scaffolding in online and blended learning environments, such as the increasingly popular flipped
classroom model.
Figure 5: Classification of student questions according to their cognitive level in Bloom’s taxonomy
Comparing student performance and participation in the online wrap-up sessions and tutorials in the VLE, we see that both activity and performance were high in 2013/2014 but low in the following year. This is an indication that the students’ participation in synchronous sessions and their performance are correlated. Thus, synchronous parts of a course (online or in class) appear to be of central importance for student learning, and asynchronous video lectures alone are not sufficient to create deep learning, even when augmented with built-in quizzes.
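The association between synchronous participation and performance amounts to a standard correlation check. As a minimal sketch, with invented per-student values rather than the study's data:

```python
# Sketch: Pearson correlation between synchronous-session attendance and
# final assignment scores. Vectors below are illustrative, one value per
# student; they are not the study's data.
import numpy as np

sessions_attended = np.array([0, 2, 3, 5, 6, 8])
final_scores      = np.array([42, 55, 58, 70, 74, 88])

r = np.corrcoef(sessions_attended, final_scores)[0, 1]
print(f"Pearson r = {r:.2f}")
```

As noted below, a high r on aggregate data does not establish causation; common covariates such as motivation remain plausible explanations.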
Similarly, we observed that high content-related activity, in the form of questions sent to the teachers and subsequent interactions, was accompanied by better student performance. Note that it was not the number of questions itself that made the difference, but whether or not questions at a higher level of understanding were posed. Questions of an administrative nature did not contribute to learning. However, while both activity and the formulation of questions appear highly correlated with performance, and a causal relationship might appear intuitively correct and consistent with the predictions of modern learning theories, our analysis is limited by the use of different data sources at the aggregate level. We cannot exclude the possibility that these results are due to common covariates. For example, more motivated students might be more likely both to pose questions and to perform better.
In sum, we conclude that the use of a VLE is neither better nor worse than traditional teaching, but teachers need to consider, and help their students adapt to, the specific benefits and challenges of VLEs. Video lectures alone are not sufficient for encouraging a deep approach to learning, but need to be accompanied by other learning activities that require students to engage in higher-order thinking skills. However, as reported recently in a review by Maarop and Embi (2016), instructors engaging in online and blended learning often struggle to develop such a well-crafted design, and need support in terms of time, pedagogical and technical skills, and finding the right balance between classroom, online and blended teaching. Thus, making the most of online and blended teaching is a challenge not only for teachers but also for institutional actors in higher education.
References
Al-Qahtani, A.A. and Higgins, S.E. (2013) “Effects of traditional, blended and e-learning on students' achievement in higher
education”, Journal of Computer Assisted Learning, Vol 29, No. 3, pp 220-234.
Bishop, J.L. and Verleger, M.A. (2013) “The flipped classroom: A survey of the research”. ASEE National Conference
Proceedings, Atlanta, GA.
Britain, S. and Liber, O. (2004) A framework for pedagogical evaluation of virtual learning environments. Research Report,
[Online], Available: https://hal.archives-ouvertes.fr/hal-00696234/document [20 Mar 2016].
Chou, S.W. and Liu, C.H. (2005) “Learning effectiveness in a Web-based virtual learning environment: a learner control perspective”, Journal of Computer Assisted Learning, Vol 21, No. 1, pp 65-76.
Daniel, J. (2012) “Making sense of MOOCs: Musings in a maze of myth, paradox and possibility”, Journal of Interactive Media in Education, Vol 2012, No. 3.
De Jong, N., Verstegen, D.M.L., Tan, F.E.S., and O’Connor, S.J. (2013) “A comparison of classroom and online asynchronous problem-based learning for students undertaking statistics training as part of a Public Health Masters degree”, Advances in Health Sciences Education, Vol 18, No. 2, pp 245-264.
Krathwohl, D.R. (2002) “A revision of Bloom's taxonomy: An overview”, Theory Into Practice, Vol 41, No. 4, pp 212-218.
Lameras, P., Levy, P., Paraskakis, I., and Webber, S. (2012) “Blended university teaching using virtual learning
environments: conceptions and approaches”, Instructional Science, Vol 40, No. 1, pp 141-157.
Laurillard, D. (2002) Rethinking university teaching: A conversational framework for the effective use of learning
technologies (2nd ed.), Routledge, London.
Li, F., Qi, J., Wang, G. and Wang, X. (2014) “Traditional Classroom VS E-learning in Higher Education: Difference between
Students' Behavioral Engagement”, International Journal of Emerging Technologies in Learning, Vol 9, No. 2, pp 48-
51.
Maarop, A.H. and Embi, M.A. (2016) “Implementation of Blended Learning in Higher Learning Institutions: A Review of Literature”, International Education Studies, Vol 9, No. 3, p 41.
McCutcheon, K., Lohan, M., Traynor, M. and Martin, D. (2015) “A systematic review evaluating the impact of online or blended learning vs. face-to-face learning of clinical skills in undergraduate nurse education”, Journal of Advanced Nursing, Vol 72, No. 2, pp 255-270.
O'Flaherty, J., and Phillips, C. (2015) “The use of flipped classrooms in higher education: A scoping review”, The Internet
and Higher Education, Vol 25, pp 85-95.
Piccoli, G., Ahmad, R., and Ives, B. (2001) “Web-based virtual learning environments: A research framework and a
preliminary assessment of effectiveness in basic IT skills training”, MIS quarterly, Vol 25, No. 4, pp 401-426.
Russell, T.L. (1999) The no significant difference phenomenon: A comparative research annotated bibliography on
technology for distance education: As reported in 355 research reports, summaries and papers. North Carolina State
University.
Tayebinik, M., and Puteh, M. (2012) “Blended Learning or E-learning?” International Magazine on Advances in Computer
Science and Telecommunications, Vol 3, No. 1, pp 103-110.
Watkins, J. and Mazur, E. (2010) “Just-in-time teaching and peer instruction”, in Simkins, S. and Maier, M. (eds.) Just-in-Time Teaching: Across the Disciplines, and Across the Academy, Stylus Publishing, Sterling, VA.
Williams, M. D. (1996) "Learner-Control and Instructional Technologies," in Handbook of Research for Educational
Communications and Technology, D. H. Jonassen (ed.), Simon and Schuster Macmillan, New York.