Comparing Student Activity and Performance in the Classroom and
a Virtual Learning Environment
Christian Stöhr, Christophe Demazière and Tom Adawi
Chalmers University of Technology, Sweden
christian.stohr@chalmers.se
demaz@chalmers.se
tom.adawi@chalmers.se

Abstract: In recent years, we have witnessed an increasing use of e-learning in higher education, triggered by both new
educational technologies and new pedagogical approaches. This development raises questions about how students learn in
virtual learning environments (VLEs) compared to traditional classroom environments. While several case studies have
examined this question, they are often based on single course iterations and there is a lack of longitudinal and quasi-
experimental comparative studies. In this paper, we examine how student activity and performance are related in a graduate
course in applied physics that was reformed by replacing the traditional classroom environment with a VLE. We use
longitudinal data from six iterations of the course, of which half were campus based and half were conducted online. We
analyse quantitative data based on home assignments, the students’ participation in course activities as well as the quantity
and quality of questions that students posed during the course. The results show that there is no statistically significant
difference in the students’ average performance across the two formats. However, for the VLE there is a substantially greater
variation in individual performance. Moreover, the participation in synchronous activities in the VLE, such as online wrap-up
sessions and tutorials, is strongly correlated with improved student performance. Further, students who asked
content-related questions were more likely to achieve better outcomes. We conclude that despite the reported benefits of video
lectures, even when augmented with built-in quizzes, these are not sufficient for encouraging a deep approach to learning.
Our results provide further evidence that video lectures need to be accompanied by other learning activities that require
students to engage in higher-order thinking skills. We discuss our results in the light of related empirical and theoretical
work, and highlight implications for teaching in blended and virtual learning environments.

Keywords: virtual learning environments, learning analytics, student performance, student activity, higher education

1. Introduction
The rapid development and availability of information technology and the internet have triggered technology-
driven change in higher education. Universities as traditional institutions of higher education are increasingly
enhancing traditional classroom experiences with open, web-based and blended forms of teaching and learning
by, for example, incorporating Learning Management Systems and Virtual Learning Environments (VLEs; e.g.
Britain and Liber, 2004) in campus education or by engaging with Massive Open Online Courses (MOOCs; e.g.
Daniel, 2012). Some teachers attempt to combine the benefits of web-based learning and in-class meetings in
more blended forms, for example by applying a flipped classroom model (Bishop and Verleger, 2013).

For more than two decades, scholars have tried to examine how effective web-based courses are compared to
traditional classroom education in higher education. Research on technology-enhanced learning in the
1990s mainly focused on video and audio technology. This strand of research found no significant
difference in student performance between traditional face-to-face instruction and technology-supported
environments (see Russell, 1999 for a comprehensive review). Thus, one can argue that it is not the technology itself, but
the instructional model that determines learning outcomes (Piccoli et al, 2001).

In contrast to early uses of technology in education, modern technologies, particularly the web-based
technologies provided by VLEs, have the potential to significantly alter the educational environment. VLEs provide a number
of advantages in terms of increased convenience and flexibility. One of the key arguments for using VLEs is the
increased amount of learner control – the students need to make their own decisions about pace, flow or order
of instruction (Williams, 1996). Laurillard (2002) has classified different media forms of learning technologies
that VLEs can support:
• Narrative media for attending and apprehending (e.g. texts, videos, audio, images)
• Interactive media for investigating and exploring (e.g. libraries, search engines, web links)
• Communicative media for discussing and debating (e.g. discussion forum, chats, email)
• Adaptive media for experimenting and practicing (e.g. online simulation, virtual labs, online quizzes)
• Productive media for expressing and presenting (e.g. blogs, wikis, word processor, spreadsheet)
Most VLEs, however, appear to primarily focus on the first and second (narrative and interactive) media
(Lameras et al, 2012). Research on web-based training and VLEs gained momentum around 15 years ago. In
one of the first studies, Piccoli et al (2001) did not find any significant difference in student performance between
VLEs and traditional teaching. They did, however, report increased self-efficacy but lower levels of student
satisfaction in VLEs. Similarly, de Jong et al (2013) and Li et al (2014) also reported indistinguishable
performance between classroom-based teaching and online teaching. Li et al. (2014), however, pointed out that
e-learning enhanced higher-level learning. On the other hand, Chou and Liu (2005), for example, found that
students in VLEs show better performance, higher levels of self-efficacy, satisfaction and learning. Al-Qahtani
and Higgins (2013), comparing classroom teaching, online learning and blended learning in an experimental
design, concluded that the latter outperformed the two others in terms of student performance, but observed
no difference between the pure classroom and online format. In a recent review of research on the flipped
classroom model, O'Flaherty and Phillips (2015) indicated that there is considerable indirect evidence of improved
student performance and satisfaction for this blended learning model. Thus, blended forms of learning appear
to be preferred over either of the pure online or classroom teaching environments, a result that is also
confirmed by Tayebinik and Puteh (2012). Nevertheless, O'Flaherty and Phillips (2015) also stressed that there
is a lack of conclusive evidence that blended learning contributes to building lifelong learning and other 21st
century skills. Finally, McCutcheon et al (2015) pointed at the wide variation in the type of online and blended
learning approaches. They nevertheless concluded that online learning is as effective as traditional teaching and
were not able to provide any clear conclusions for blended learning approaches.

In sum, while there are no definite findings on whether or not traditional teaching outperforms online learning
or vice versa, there are indications that approaches that are sensitive to the particularities of both the campus
and online learning environments result in better student performance. Thus, the rapid technological
development needs to be accompanied by pedagogical approaches that fit technology-based learning
environments and further research is needed to explore conditions that optimize student learning in VLEs.

In this article, we contribute to this discussion by examining how the teaching environment, student activity and
student performance are related in a graduate course in applied physics. The course (taught by the second
author) was first delivered in a traditional classroom environment, but was later revised and delivered in a VLE.
Thus, we have longitudinal data over six course iterations. The use of educational technology also made it
possible to collect new kinds of evaluation data, enabling us not only to analyse student performance in tests
but also to examine their activity for different, ungraded elements of the course. More specifically, two research
questions are addressed in this article:
• How does the reform from a traditional classroom environment to a VLE affect student performance?
• How do student activity and feedback impact student performance in the VLE?

2. Methods
In this study, we examine the relation between learning environment, student activity and performance using
quantitative data based on home assignments, the students’ participation in course activities as well as the
quantity and quality of questions that the students posed during the course. In this section, we describe the
pedagogical approach and set-up of the course during the different iterations. We start with a description of the
course, followed by information about the data collection and analysis process.

2.1 The course


Our analysis is based on the course Modelling of Nuclear Reactors, offered to master and PhD students at
Chalmers University of Technology. The aims of the course are for the students to be able to comprehend and
apply the solution methodology of the computer codes used by the nuclear community for simulating the
behaviour of nuclear reactors. The main emphasis of the modelling course is to derive the typical algorithms,
approximations, and the corresponding limitations, so that the students can apply the codes with confidence in
situations that fall within the validity of the algorithms. The curriculum for the course is organized in four
chapters. For each chapter, students need to submit a home assignment, where they apply some of the
algorithms presented in the course to solve practical problems. To do so, they have to develop their own codes
and write corresponding reports. In addition, the students write short summaries after each chapter that are peer
reviewed. That creates an opportunity for students to reflect on their peers' contributions, and indirectly on
their own. In order to pass the course, students need to attend at least 75% of the webcasts and complete all
summaries (which are nevertheless ungraded) as well as the home assignments.

We use longitudinal data from yearly iterations of the course over six years, starting in 2009, and during which
the course was reformed from a campus-based format to a pure web format. During the first two years, the
course was delivered in a traditional campus-based format. In the third year, the course was also offered in a
campus-based format, but the in-class sessions were recorded and made available to the students after class.
For the three following consecutive years, the course was entirely delivered in a VLE. In year four, the course
was offered in a web-based format with asynchronous parts (webcasts) and synchronous parts (tutorials). During
the last two years, the web-based format was enhanced by including online quizzes and synchronous wrap-up
sessions, and the final version of the course consisted of the following elements:
• Pre-recorded lectures (webcasts) that were available to students for on-demand viewing;
• Tutorials that were live-broadcasted with synchronous interactions between the students and the teachers;
• Online quizzes embedded in the webcasts that focused on conceptual understanding;
• Regular synchronous wrap-up sessions designed to address the students' needs and based on the input from the students; and
• A discussion forum.
The wrap-up sessions were part of a Just-in-Time Teaching approach (JiTT; Watkins and Mazur, 2010): their
content was adapted to the students' needs based on the input the students provided. In addition, students were
provided with feedback prompts to pose questions to the teachers while watching the lectures. They also had
the opportunity to rate the lectures and provide more specific feedback on them. The discussion fora
were created to enable students to interact with each other, and provided valuable information for formative
feedback and necessary interventions. In practice, however, the fora were used exclusively for the home
assignments.

The course is hosted on a VLE (or learning management platform) called Ping Pong, developed by Ping Pong AB,
Sweden. This VLE is used as an entry portal to all electronic resources in the course. In addition, Ping Pong is
utilized for managing the online quizzes, the rating of the webcasts, the feedback on the webcasts, and the
discussion fora. The webcasts are recorded and broadcasted (for on-demand viewing exclusively) using
Mediasite, developed by Sonic Foundry Inc., USA. Finally, the wrap-up and tutorials are live-streamed, as well as
recorded for later on-demand viewing, using Adobe Connect, developed by Adobe Systems Inc., USA.

2.2 Data collection


Ping Pong, Mediasite, and Adobe Connect all offer the possibility to monitor students’ activity and performance,
even if the level of detail of the available data significantly differs between the three platforms. As all three
platforms collect their data separately, we could not link the data at the level of individual students. Instead,
comparisons between the different factors were made at the aggregate level, across the different course
iterations. This somewhat reduces the interpretability of the analysis, but still offers relevant results. In the
following we briefly present the different factors we measured and how we operationalized them.

2.2.1 Student performance


To measure performance, we use the students' results on the home assignments, which are available for all six
years. During the first four years, three home assignments were offered. A fourth assignment, covering the last
chapter of the course, was added from the fifth iteration onwards, after students in the final course evaluations
had expressed a wish for an additional assignment on that chapter. Otherwise, the same assignments were kept
throughout the six iterations. The students' performance on these assignments can thus reliably
be compared from year to year. The number of students enrolled in the different course iterations was similar
but limited (see Table 1). However, the teacher considered the level of the students at the start of the course
to be very similar each year. As a result, inter-comparison of the students' performance from year to year is possible.
Differences in students’ performance can thus be attributed to the modifications brought to the pedagogical
approach followed during the successive course iterations. We report on average performance as well as
standard deviations for all six course iterations hereafter.
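As an illustration, the descriptive statistics reported in Section 3.1 require only a few lines of code. The following Python sketch assumes the scores for one home assignment are arranged as a mapping from course year to a list of per-student scores; the score values are hypothetical placeholders, not the actual course data.

    # Minimal sketch of the per-iteration statistics (mean and standard
    # deviation) for one home assignment. All scores are placeholders.
    from statistics import mean, stdev

    scores_assignment_1 = {
        "2009/2010": [7.5, 8.0, 9.0, 6.5],  # hypothetical scores
        "2012/2013": [9.5, 4.0, 8.0, 5.5],  # hypothetical scores
    }

    for year, scores in scores_assignment_1.items():
        # stdev() requires at least two observations per iteration
        print(f"{year}: mean = {mean(scores):.2f}, sd = {stdev(scores):.2f}")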


Table 1: Number of students completing the first home assignment


Year Number of students
2009/2010 14
2010/2011 7
2011/2012 10
2012/2013 10
2013/2014 8
2014/2015 11

2.2.2 Student activity


Student activity refers to the extent to which students participate in the different course activities. Our data is
limited to the last three years, when the course was delivered in a VLE. We focus on the students' participation in
synchronous course activities, i.e. the tutorials for the last three course iterations and the wrap-up sessions for
the last two iterations, as no comparable wrap-up session was offered during 2012/2013. The monitoring of the
synchronous sessions was performed using the data available from Adobe Connect for all students that also
submitted the home assignments (see Table 1).
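To illustrate how such participation measures can be derived, the sketch below reduces each platform export to (student, session) pairs and computes the share of synchronous sessions each student attended. The record format and identifiers are assumptions made for the example, not the actual Adobe Connect export format.

    # Sketch of computing per-student attendance shares from join records.
    # The (student_id, session_id) pairs below are illustrative only.
    from collections import defaultdict

    joined = [
        ("s01", "tutorial-1"), ("s01", "tutorial-2"),
        ("s02", "tutorial-1"),
    ]
    N_SESSIONS = 4  # e.g. four tutorials in a course iteration

    attended = defaultdict(set)
    for student, session in joined:
        attended[student].add(session)  # a set ignores duplicate joins

    for student, sessions in sorted(attended.items()):
        print(f"{student}: attended {len(sessions) / N_SESSIONS:.0%} of the sessions")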

2.2.3 Student feedback


The last indicator measures the amount and kind of feedback students give on the course content. We captured
this by monitoring the questions sent to the teachers while the students attended the different webcasts. The
questions were then manually extracted and ordered by topic. As the initial set-up resulted in assignment-related
questions only, the objective of the second course reform in 2013 was to encourage students to pose and discuss
questions relating not only to the home assignments but also to the content presented during the webcasts.
Questions not dealing with the home assignments were also extracted and categorized based on whether they
related to content or administrative aspects. Finally, the content-related questions were categorized by one of
the co-authors according to their cognitive level in Bloom's taxonomy (e.g. Krathwohl, 2002).
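The coding itself was done manually; only the subsequent tallying lends itself to automation. A minimal sketch, assuming each content-related question has already been labelled with one level of the revised taxonomy (the labels below are illustrative, not the actual coding):

    # Tally manually coded questions by Bloom level. Labels are placeholders.
    from collections import Counter

    BLOOM_LEVELS = ["remembering", "understanding", "applying",
                    "analyzing", "evaluating", "creating"]

    coded_questions = ["understanding", "analyzing", "understanding",
                       "evaluating", "remembering"]  # hypothetical coding

    counts = Counter(coded_questions)
    for level in BLOOM_LEVELS:
        print(f"{level:>13}: {counts[level]}")  # Counter returns 0 for absent levels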

3. Results

3.1 Analysis of student performance on home assignments


The average student scores for the four home assignments over six course iterations are illustrated in Figure 1.
For home assignments #1-#3, the average performance tends to be better during the earlier iterations of the course,
delivered in a traditional classroom format. In particular, for the last course iteration, the performance dropped
significantly for assignments #2 and #3. Similarly, we see a significant drop for assignment #4. Thus, student
performance did not improve by delivering the course in a VLE.
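The paper does not specify which significance test underlies the claim of no statistically significant difference; one plausible choice for small groups of unequal size is Welch's t-test on per-student scores pooled by course format, sketched below with placeholder scores.

    # Hedged sketch: Welch's t-test comparing assignment scores between the
    # classroom and VLE formats. The score lists are placeholders, not the
    # actual course data, and the choice of test is an assumption.
    from scipy import stats

    classroom_scores = [8.0, 7.5, 9.0, 6.5, 8.5]  # campus iterations (placeholder)
    vle_scores = [9.5, 4.0, 8.0, 5.5, 9.0]        # VLE iterations (placeholder)

    t_stat, p_value = stats.ttest_ind(classroom_scores, vle_scores, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p >= 0.05: no significant difference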

Figure 1: Average student scores for the four home assignments over six course iterations


The analysis of student performance can be enhanced by not only looking at the average scores on the home
assignments, but also at the variation between students. The standard deviation for the four assignments is
presented in Figure 2. For the traditional format, the standard deviation is generally significantly lower than for
the VLE. So while in the traditional format the results of the home assignments were fairly homogeneous among
the students, we see a much larger variation in student performance for the online format. For the course
iterations 2012/2013 and 2014/2015, the standard deviation was comparatively high for all three and all four
assignments, respectively. The year 2013/2014 appears somewhat exceptional, with high variation in home assignments #1
and #2 but a comparatively low standard deviation for assignments #3 and #4.

Figure 2: Standard deviations for the four home assignments over six course iterations

3.2 Analysis of student attendance in synchronous sessions


Figures 3 and 4 show student attendance in the live-broadcasted sessions, i.e. the four tutorials and the four
wrap-up sessions. While the available data are limited, Figure 3 shows that student attendance in the tutorials
was at similar levels for the first and second tutorial, but was significantly lower in the last iteration of the course
for the third and fourth tutorial. With respect to attendance in the wrap-up sessions, a great difference between
the two course iterations was observed (Figure 4). While over two thirds of the students participated in all the
wrap-up sessions during 2013/2014, only about one third of the students did so in the following year.

Figure 3: Student attendance in the tutorial sessions


Figure 4: Student attendance in the wrap-up sessions

3.3 Analysis of student questions sent to the teacher


We also analyzed the questions posed to the teachers by looking at the amount and quality of course
content-related questions posted (Table 2). Although a forum had been available in the course since 2012/2013,
the forum was used exclusively for questions related to the home assignments. After the course reform, the
teachers also received questions on the home assignments directly. Home assignment-related questions (either
sent to the teachers or collected via the forum) are not considered in the analysis reported hereafter.
Table 2: Questions sent to the teachers (excluding questions related to home assignments)
Academic year    Total number of questions    Content-related questions    Administrative questions
2012/2013        0                            0                            0
2013/2014        39                           16 (42%)                     23 (58%)
2014/2015        57                           2 (4%)                       55 (96%)
In the course iteration 2013/2014, the students posted 39 questions on aspects of the course not related
to the home assignments, of which a bit more than half (58%) were of an administrative nature and the rest were
content-related. In the academic year 2014/2015, even more questions (57) were received. However, 96% of
those were of an administrative nature; only 2 of the questions addressed content aspects of the course.

Finally, we categorized the content-related questions according to their cognitive level using the revised version
of Bloom’s taxonomy (Krathwohl, 2002). Due to the limited number of questions, this analysis could only be
meaningfully done for 2013/2014. From Figure 5, it can be seen that with the exception of “creating”, the
students posed questions covering all cognitive levels of the taxonomy. Most of the questions posed by the
students belong to the "understanding" level, followed by the "analyzing" level and the "evaluating" level.
However, it is worth noting that more than half of the questions were at a higher cognitive level (above
"remembering" and "understanding"). The two content-related questions in the last course iteration were
assigned to the categories "evaluating" and "creating".

4. Discussion and conclusions


In this article, we have examined student performance in an applied physics course that was reformed by
replacing the traditional classroom environment with a VLE. The results show that there is no statistically
significant difference in the students’ average performance across the two formats. Thus, the use of a VLE did
not improve student performance. If anything, and against our expectations, performance even got somewhat
worse. This is consistent with most of the results from previous research presented in the introduction. However,
from the analysis of standard deviations, we conclude that there are significant differences between the two
formats. For the VLE, there is a substantially greater variation in the individual student performances. It seems
that for VLEs, stronger students become even stronger, while struggling students perform even worse. This could
be explained by the fact that an online course, with its increased flexibility, requires a higher degree of self-
regulation. In other words, VLEs place higher demands on students for taking responsibility and control over
their learning. It is therefore important to consider how to provide struggling students with the necessary
support and scaffolding in online and blended learning environments, such as the increasingly popular flipped
classroom model.

Figure 5: Classification of student questions according to their cognitive level in Bloom’s taxonomy
Comparing student performance and participation in the online wrap-up sessions and tutorials in the VLE, we
see that both activity and performance were high in 2013/2014 but low in the following year. This
indicates that the students' participation in synchronous sessions and their performance are correlated.
Thus, synchronous parts in a course (online or in class) appear to be of central importance for student learning
and asynchronous video lectures alone are not sufficient to create deep learning, even when augmented with
built-in quizzes.

Similarly, we observed that high content-related activity in the form of questions sent to the teachers and
subsequent interactions was accompanied by better student performance. Note that it was not the number of
questions itself that made the difference, but whether or not questions at a higher level of understanding were
posted. Questions of an administrative nature did not contribute to learning. However, while both activity and
the formulation of questions appear highly correlated with performance, and a causal relationship might seem
intuitively correct and consistent with the predictions of modern learning theories, our analysis is limited by the
use of different data sources at the aggregate level. We cannot exclude the possibility that these results are due to
common co-variates. For example, more motivated students might also be more likely to pose questions and
also perform better.

In sum, we conclude that the use of a VLE is not better or worse than traditional teaching, but teachers need to
consider and help their students to adapt to the specific benefits and challenges of VLEs. Video lectures alone
are not sufficient for encouraging a deep approach to learning, but need to be accompanied by other learning
activities that require students to engage in higher-order thinking skills. However, as reported recently in a
review by Maarop and Embi (2016), instructors engaging in online and blended learning often struggle to
develop such a well-crafted design, and need support in terms of time, pedagogical and technical skills and
finding the right balance between classroom, online and blended teaching. Thus, making the most of online and
blended teaching is a challenge not only for teachers but also for institutional actors in higher education.

References
Al-Qahtani, A.A. and Higgins, S.E. (2013) “Effects of traditional, blended and e-learning on students' achievement in higher
education”, Journal of Computer Assisted Learning, Vol 29, No. 3, pp 220-234.
Bishop, J.L. and Verleger, M.A. (2013) “The flipped classroom: A survey of the research”. ASEE National Conference
Proceedings, Atlanta, GA.

Britain, S. and Liber, O. (2004) A framework for pedagogical evaluation of virtual learning environments. Research Report,
[Online], Available: https://hal.archives-ouvertes.fr/hal-00696234/document [20 Mar 2016].
Chou, S.W. and Liu, C.H. (2005) “Learning effectiveness in a Web-based virtual learning environment: a learner control
perspective”, Journal of Computer Assisted Learning, Vol 21, No. 1, pp 65-76.
Daniel, J. (2012) “Making sense of MOOCs: Musings in a maze of myth, paradox and possibility”, Journal of Interactive
Media in Education, Vol 2012, No. 3.
De Jong, N., Verstegen, D.M.L., Tan, F.E.S., and O’Connor, S.J. (2013) “A comparison of classroom and online asynchronous
problem-based learning for students undertaking statistics training as part of a Public Health Masters
degree”, Advances in Health Sciences Education, Vol 18, No. 2, pp 245-264.
Krathwohl, D.R. (2002) “A revision of Bloom's taxonomy: An overview”, Theory Into Practice, Vol 41, No. 4, pp 212-218.
Lameras, P., Levy, P., Paraskakis, I., and Webber, S. (2012) “Blended university teaching using virtual learning
environments: conceptions and approaches”, Instructional Science, Vol 40, No. 1, pp 141-157.
Laurillard, D. (2002) Rethinking university teaching: A conversational framework for the effective use of learning
technologies (2nd ed.), Routledge, London.
Li, F., Qi, J., Wang, G. and Wang, X. (2014) “Traditional Classroom VS E-learning in Higher Education: Difference between
Students' Behavioral Engagement”, International Journal of Emerging Technologies in Learning, Vol 9, No. 2, pp 48-
51.
Maarop, A.H. and Embi, M.A. (2016) “Implementation of Blended Learning in Higher Learning Institutions: A Review of
Literature”, International Education Studies, Vol 9, No. 3, p 41.
McCutcheon, K., Lohan, M., Traynor, M. and Martin, D. (2015) “A systematic review evaluating the impact of online or
blended learning vs. face-to-face learning of clinical skills in undergraduate nurse education”, Journal of Advanced
Nursing, Vol 72, No. 2, pp 255-270.
O'Flaherty, J., and Phillips, C. (2015) “The use of flipped classrooms in higher education: A scoping review”, The Internet
and Higher Education, Vol 25, pp 85-95.
Piccoli, G., Ahmad, R., and Ives, B. (2001) “Web-based virtual learning environments: A research framework and a
preliminary assessment of effectiveness in basic IT skills training”, MIS Quarterly, Vol 25, No. 4, pp 401-426.
Russell, T.L. (1999) The no significant difference phenomenon: A comparative research annotated bibliography on
technology for distance education: As reported in 355 research reports, summaries and papers. North Carolina State
University.
Tayebinik, M., and Puteh, M. (2012) “Blended Learning or E-learning?” International Magazine on Advances in Computer
Science and Telecommunications, Vol 3, No. 1, pp 103-110.
Watkins, J. and Mazur, E. (2010) “Just-in-time teaching and peer instruction”, in Simkins, S. and Maier, M. (eds.) Just-in-time
teaching: Across the disciplines, and across the academy, Stylus Publishing, Sterling, VA.
Williams, M. D. (1996) "Learner-Control and Instructional Technologies," in Handbook of Research for Educational
Communications and Technology, D. H. Jonassen (ed.), Simon and Schuster Macmillan, New York.
