
FALL 2020/WINTER 2021 | VOLUME 3 | ISSUE 2

FEATURE ARTICLE

Virtual Learning Assessment: Practical Strategies for Instructors in Higher Education
Kenneth Denton, Department of Psychology, West Texas A&M University
Michelle Simmons, Department of Education, West Texas A&M University
https://doi.org/10.36896/3.2fa3
ABSTRACT
Many universities have been designing and implementing online instruction for years, but the COVID-19
pandemic created an unexpected impetus for the expansion of virtual learning. Instructors and students who
may not have previously chosen or experienced online instruction found themselves in need of safe, virtual
options, and it appears that the general shift to virtual learning is here to stay. Strong, reliable assessment is a
major component of virtual instruction, and instructors have several options for structuring student feedback.
This article reviews the relevant literature regarding effective online assessment techniques and makes
recommendations for the use of examinations and more authentic assessments, including video demonstrations,
group projects, and discussion forums. Various data analytics within Learning Management Systems (LMS) are
also explored. Discussion includes implications of online assessment and avenues for important research to
strengthen response to this growing need.

Keywords: virtual learning, virtual instruction, higher education institutions, online assessment

Corresponding Author: Kenneth Denton, PhD, Department of Psychology, West Texas A&M University, 720 S. Tyler Street, Amarillo, TX 79101. Email: kdenton@wtamu.edu

As Spring 2020 arrived, many could not anticipate the challenges on the horizon. The emergence of a pandemic, COVID-19, presented novel situations never before faced: personally, professionally, and economically. Typical, familiar routines were going to change, and education was not spared this new reality. While businesses closed and homes became office spaces and schools for many families, education systems navigated new territory in virtual instructional delivery. Virtual instructional delivery impacted educational settings from kindergarten through university-level institutions (Gonzalez et al., 2020; Marshall et al., 2020). Virtual delivery of instruction at institutes of higher education was not a new concept before COVID-19; however, the widespread transition of a great number of students was unprecedented. In an extremely short time, a vast number of instructors were faced with the challenge of making a novice shift to the use of online learning management systems (LMS) and virtual learning strategies (Perrotta & Bohan, 2020).

Virtual instruction in a higher education setting is characterized by varied and diverse instructional methods. A review of previous research indicates that the process of learning online is typically categorized by the instructional delivery of content and student feedback (Gaytan & McEwen, 2007; Sieber, 2005). Traditionally, instruction in higher education occurs in a face-to-face setting, whereas online instruction is becoming a more common and widely accepted approach (Barnard-Brak & Shiu, 2010). In light of this rapid and recent demand for the use of virtual instructional technology at institutes of higher education nationwide, this article seeks to offer support to instructors responsible for virtual instructional delivery. The following demonstration of promising strategies for virtual learning assessment emphasizes the critical role assessment plays in postsecondary student performance.


The use and modification of traditional exams and more authentic assessments, such as video demonstrations, group projects, and discussion forums, are discussed. Authors provide recommendations for practical strategies to incorporate the effective use of online assessment in LMS.

Online Assessment in a Virtual Learning Environment

Varied and diverse instructional methods characterize virtual instruction in the higher education setting. Studies of student perspectives regarding face-to-face versus online instruction indicate that students prefer direct interaction, real-time communication with the instructor, the use of a variety of instructional methods, and responsive learning strategies during the initial delivery of course content (Gaytan & McEwen, 2007; Sieber, 2005). An online learning environment has been relevant and critical across institutes of higher education for more than 10 years (Corey & Ben-Porath, 2020; Gaytan & McEwen, 2007; Sieber, 2005; Young, 2006). Undergraduate and graduate students have previously chosen to engage in online learning due to location, employment schedules, or a preference for online instructional delivery (Gaytan & McEwen, 2007). However, students who may not have selected online learning as a preferred instructional approach previously have been forced into this learning environment due to COVID-19 (Perrotta & Bohan, 2020). This shift necessitates instructor competence in a virtual learning environment. Previous researchers have argued that online assessment can be more challenging than traditional assessment formats due to the demand for innovation and the absence of human interaction. For instance, Gaytan and McEwen (2007) found that "using effective assessment techniques is an essential part of effective teaching and learning in the electronic environment" (p. 118). Therefore, instructors can greatly improve online instruction and learning by developing assessments that encourage reflection and meaningful engagement with materials. The following discussion outlines virtual assessment formats and establishes online assessment guidelines for higher education course integration. See Figure 1 for an outline of helpful strategies for including assessment in online higher education courses. In addition to the thoughtful use of exams and authentic assessment strategies listed, authors encourage instructors to consider leveraging data-collection features embedded in LMS as a part of online assessment.

Figure 1
Tips for Effective Online Assessment

Exams, General:
● Limit access to exam items
● Use virtual proctoring (i.e., lock-down browsers or video surveillance devices or systems)
● Use open-ended essay questions
● Consider open-resource exams

Exams, Multiple-Choice/Selection:
● Use random item pools for exam question selection
● Set time restrictions
● Allow multiple trials
● Reduce the weight of exam grades

Authentic Assessment, Video Demonstrations:
● Provide structured guidelines for self and peer reflection/feedback
● Provide and clearly communicate rubrics prior to submission of assignments

Authentic Assessment, Group Projects:
● Create collaborative, field-based products
● Provide instructor and peer feedback toward goals
● Include anonymous or confidential peer reviews of contributions
● Provide clear project rubrics

Authentic Assessment, Discussion Forums:
● Introduce concrete questions
● Use field-based examples to generate discussion
● Allow students to grapple with questions that may not have an answer
● Provide a resource for a common concern in each forum to focus the discussion on collaborative problem-solving
● Provide a moderate level of direct feedback regarding discussion posts
● Encourage peer responses

Note. Figure 1 adapted from Barry, 2012; Gaytan & McEwen, 2007; Harmon et al., 2010; Salter & Conneely, 2015; Schultz & Quinn, 2014; Sieber, 2005; Talley & Scherer, 2013; Young, 2006.

Exams

Exams and quizzes are often one of the first options considered by instructors for the assessment of knowledge and skills when developing an online course. Historically, student performance in online courses has been evaluated primarily through the use of unproctored, multiple-choice exams (Harmon et al., 2010). Despite the relative objectivity and longstanding history of these exams, instructors should look for ways to provide more authentic and interactive assessments of knowledge and skills (Kim et al., 2008; Shaw, 2019).

A major concern with online exams relates to integrity and academic dishonesty (Harmon et al., 2010). There is an ongoing investigation in the literature regarding cheating in online courses, with differing results when comparing rates, motivations, and types of cheating online as opposed to face-to-face classes (e.g., Stowell & Bennett, 2010; Watson & Sottile, 2010).


Despite these findings, researchers generally agree that online exams present an increased risk of cheating, particularly exams that are not proctored (Harmon et al., 2010; Watson & Sottile, 2010). Several adjustments improve online exams: limiting access to exam items or questions, reducing the weight of exam grades, and virtual proctoring, including the use of lock-down browsers or video surveillance systems. Virtual proctoring strategies are specifically recommended for exam integrity (Harmon et al., 2010) but are not always practical and could conflict with student needs or motivations for enrolling in online courses (Sieber, 2005).

Since unproctored multiple-choice exams are practical and likely to be very prevalent in online courses, instructors might consider three sensible strategies to improve these assessments: random item pool selection, time restrictions, and multiple trials. First, well-developed online question pools can allow for better content validity and comprehensive assessment of knowledge. To further improve this type of assessment, instructors should review item pools and select well-written questions that accurately represent their course objectives; this is especially true when using publisher-provided multiple-choice question pools. In this case, instructors should consider adding self-written questions to meet a predetermined ratio. For example, if a publisher question pool has 40 relevant questions, instructors may add 10 instructor-developed questions to reach a goal of 20% course-specific items. The exam can then be set to randomly select items from the larger question pool (Shuey, 2002, as cited in Harmon et al., 2010). This random selection of questions is a valuable feature in online courses and mimics shuffling strategies proven effective in face-to-face courses (e.g., multiple paper versions, etc.; Harmon et al., 2010).
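
To make the item-pool arithmetic and the random draw concrete, the sketch below computes how many instructor-written questions are needed to reach a given share of the pool (10 added to 40 publisher questions yields a 50-item pool that is 20% course-specific) and then samples a random subset for an exam. This is an illustrative example only, not a feature of any particular LMS; the item labels and the items_needed and draw_exam helpers are hypothetical.

```python
import random

def items_needed(publisher_count: int, target_ratio: float) -> int:
    """Instructor-written items needed so they make up target_ratio of the final pool.

    Solving i / (publisher_count + i) = target_ratio gives
    i = publisher_count * target_ratio / (1 - target_ratio).
    """
    return round(publisher_count * target_ratio / (1 - target_ratio))

def draw_exam(publisher_items, instructor_items, exam_length, seed=None):
    """Randomly select exam_length items from the combined pool (one draw per student)."""
    rng = random.Random(seed)
    pool = list(publisher_items) + list(instructor_items)
    return rng.sample(pool, exam_length)

if __name__ == "__main__":
    publisher_items = [f"PUB-{n:02d}" for n in range(1, 41)]    # 40 publisher questions
    needed = items_needed(len(publisher_items), 0.20)           # -> 10 instructor questions
    instructor_items = [f"INST-{n:02d}" for n in range(1, needed + 1)]
    print(f"Add {needed} instructor items for a pool of {len(publisher_items) + needed}.")
    # Each student receives a different random subset, mimicking shuffled paper versions.
    print(draw_exam(publisher_items, instructor_items, exam_length=25, seed=1))
```
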
To further protect exams and ensure a more accurate evaluation of knowledge, instructors should limit exam duration. In most cases, this can reduce cheating opportunities or unauthorized peer collaboration, especially in conjunction with random pool selection. Time limits are not a perfect solution and may be overwhelming to some students. However, Stowell and Bennett (2010) determined that student test anxiety is dependent on several factors beyond online vs. face-to-face administration and that limited virtual exam administration time may reduce student anxiety, depending upon student course expectations. In addition, allotting more time for exams does not seem to lead to better performance (Portolese et al., 2016).

Instead of allowing students an unspecified amount of time to complete exams, instructors are encouraged to allow students to take exams or quizzes multiple times (two or more trials depending on the size of the question pool). Doing so may reduce the burden associated with time restrictions, with the added benefit of encouraging self-regulated learning. Namely, instructors can recommend or require students to take an exam once, early within a module, after exposure to relevant material. Based on initial trial results, students can identify areas of course content weakness, target concept development for the study, and make an additional attempt, thus allowing formative evaluation through ongoing progress feedback (Kim et al., 2008).
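
As a concrete illustration of how a first attempt can be turned into targeted study guidance before a second trial, the sketch below tallies first-attempt results by topic and reports the weakest areas. It is a generic, hypothetical example; the topic labels and the weakest_topics helper are not drawn from the sources cited above or from any particular LMS.

```python
from collections import defaultdict

def weakest_topics(attempt, max_topics=3):
    """Return the lowest-scoring topics from a first quiz attempt.

    `attempt` is a list of (topic, correct) pairs, one pair per exam item.
    """
    totals = defaultdict(lambda: [0, 0])            # topic -> [correct, answered]
    for topic, correct in attempt:
        totals[topic][0] += int(correct)
        totals[topic][1] += 1
    scores = {topic: right / asked for topic, (right, asked) in totals.items()}
    return sorted(scores, key=scores.get)[:max_topics]

# Hypothetical first-attempt results a student could review before a second trial.
first_attempt = [
    ("reliability", True), ("reliability", False),
    ("validity", False), ("validity", False),
    ("item analysis", True), ("item analysis", True),
]
print(weakest_topics(first_attempt))    # ['validity', 'reliability', 'item analysis']
```
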
One further strategy is to shift away from selection-based (i.e., multiple-choice, true-false) exams altogether. Open-answer or essay assessments encourage access to higher-order learning and knowledge, which is also more appropriate for open-source administration. Essay or short-answer exam responses allow assessment of writing and concept development beyond more basic recognition tasks, such as analyzing an argument or compare/contrast responses. In addition, writing samples can be pulled from previous products such as personal journals, online portfolios, or more collaborative, group writing (e.g., Google Docs) to allow for analysis of progress and formative assessment as well (Kim et al., 2008). Myyry and Joutsenvirta (2015) reported that although open-source exams do not directly correlate to learning outcomes, online access to sources (i.e., textbook, web-based) reduces anxiety and improves self-regulated learning. Open-source exams also emulate authentic, field-based scenarios and professional applications.

In other words, real-world scenarios more often involve collaboration and collection of sources and information to address questions and problems in the field instead of accurate, in-the-moment recall. If instructors have academic integrity concerns, most LMS have the ability to review student submissions through plagiarism-checking software (e.g., Turnitin; Watson & Sottile, 2010).

Authentic Assessment

Shaw (2019) defined authentic assessments as "creative learning experiences to test students' skills and knowledge in realistic situations," which require the application of student learning. Authentic assessments bridge student critical thinking capacity and are best used in conjunction with more "traditional" assessment approaches (Kim et al., 2008). As such, authentic assessment can be applied in many creative ways in a virtual setting.

Video Demonstrations

Student-developed video is an authentic assessment strategy that facilitates collaboration and peer-assisted learning. Video demonstrations involve the recording of mock or actual professional skills and are often used in healthcare or mental health fields for clinical skills and psychological assessments (e.g., Roberts & Davis, 2015; Seif et al., 2013). Video presentations challenge students to synthesize and integrate knowledge through field-based applications (Barry, 2012; Talley & Scherer, 2013). LMS and online resources for video recording and collaboration (e.g., Zoom, YouTube, VoiceThread, etc.) are becoming more available and accessible to students and universities. These platforms make it easy for students to create and submit high-quality recordings and presentations with most cell phones or laptops. Both synchronous and asynchronous video delivery increases student engagement through thoughtful review, self and peer evaluation, instructor evaluation, and feedback. When paired with structured guidelines for self and peer reflection, student videos can be an effective strategy for self-regulated learning and improve positive learning behaviors (Barry, 2012; Schultz & Quinn, 2014; Talley & Scherer, 2013). Additionally, assessment of student videos must utilize rubrics that are clearly communicated for both the instructor's assessment of knowledge and skills as well as for the student's self- and peer-review (Barry, 2012; Schultz & Quinn, 2014).

Group Projects

Group projects, another authentic assessment technique, develop transferable skills in collaboration and communication. Through developing a collaborative product, students share knowledge and provide peer evaluation and feedback. Group projects cultivate self-regulated learning and real-world skills and are most effective when groups are given ongoing progress feedback (Tsai, 2013). In this way, instructors effectively develop practical, professional skills by designing group collaborations to mimic professional products (e.g., lesson plans for teachers). Additionally, group projects reduce instructor grading demands, which can be especially helpful when providing feedback on lengthy writing assessments. Current and emerging platforms (e.g., Google Docs, Dropbox Paper, Remind, or social media) allow student groups to write, edit, comment, and provide dynamic peer assistance to develop papers, essays, and presentations. Instructors can use these same editing and collaborative features to check for understanding and provide more frequent or timely feedback.

Although some students may dislike group assignments, online students generally perceive them much as students in face-to-face classes do (Johnson, 2006). Students view the modeling of structured and timely communication and the use of meaningful examples as particularly effective practices in online courses (Young, 2006), and these can help to orient students to group expectations as well. Using rubrics for group projects helps instructors set clear expectations and encourages timely feedback on progress (Gaytan & McEwen, 2007). To be effective, rubrics should define the project tasks that are relevant to practice, tie to learning objectives, and clearly state the criteria used for evaluation (Shaw, 2019). Rubrics also ensure common goals and guide the division of responsibilities within groups. Similarly, students may reasonably be concerned about the equitable division of work and grading among group members. To combat this concern, authors suggest that instructors create opportunities for anonymous or confidential student self and peer feedback regarding their group contributions. If well structured, this feedback may be used as a portion of the project grade or included as a separate course participation grade.

Discussion Forums

Student discussion forums are another common, well-documented strategy to evaluate virtual student progress and knowledge acquisition (Balaji & Chakrabarti, 2010; Kim et al., 2008). Discussion forums are a convenient assessment tool that allows all students to socially engage in course content, which is otherwise difficult in face-to-face classes due to time and space constraints or social dynamics. Student perceptions and engagement levels during online discussion forums are varied; however, it is apparent that structure and clear expectations are critical (Balaji & Chakrabarti, 2010; Salter & Conneely, 2015).

Discussions designed to target student-centered, field-based problem solving improve course performance and learning (Stockwell et al., 2015). Instructors should pose concrete questions, use field-based examples to generate discussion, and, when possible, allow students to grapple with questions that may not have an answer (Salter & Conneely, 2015). Further support may be given by providing a video link or article about a common concern or debate in each forum to help focus the discussion on collaborative problem-solving. Finally, instructors should strive to provide a moderate level of direct feedback regarding discussion posts and encourage peer responses (Balaji & Chakrabarti, 2010; Salter & Conneely, 2015).

These strategies have encouraging results, even in courses that are often viewed as involving more technical skills, such as statistics (Everson & Garfield, 2008). When forums are structured in this way, McDougall (2015) found they encouraged respectful disagreement and conceptual development that students found authentic and beneficial. In addition, discussion forums allow for further assessment of writing development and skills, similar to essay prompts on exams. Importantly, though, discussion forums seem to build writing skills and empowerment through self-reflection and the use of peer models in written posts (Salter & Conneely, 2015). Sieber (2005) also recommended that students be required to write discussion posts in a professional communication style, which will further develop practical writing skills and reinforce the relevance of these assessments.

Leveraging LMS Features for Assessment

It is highly recommended that instructors take advantage of monitoring technology available in many LMS. Metric tracking can be used and reviewed to gauge student engagement with course content. You (2016) noted that student self-regulated learning behaviors (e.g., frequency of logins, time spent in content areas, number of assessment attempts, etc.) predicted success in an online course. In contrast to course attendance or participation, instructors can monitor and grade students' self-regulated learning behaviors through LMS evaluation metrics. By doing so, instructors communicate to students the importance of self-regulated learning and its relationship to successful learning outcomes. As an additional graded metric, students can be asked to self-report reading goals and time spent reading for each course module through a course survey. Student self-reporting strategies evaluate important metrics of student learning and support self-regulated learning and success in all courses.
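
For instructors who want to review such metrics outside the LMS itself, a small script over an exported activity report can surface students whose engagement may warrant outreach. The sketch below is purely illustrative; the CSV column names (student_id, logins, minutes_in_content, quiz_attempts) and the thresholds are hypothetical and would need to match an institution's actual LMS export.

```python
import csv

# Hypothetical per-module thresholds for flagging low self-regulated-learning activity.
MIN_LOGINS = 3
MIN_MINUTES = 60
MIN_ATTEMPTS = 1

def flag_low_engagement(path):
    """Read a (hypothetical) LMS activity export and list students below the thresholds."""
    flagged = []
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            logins = int(row["logins"])
            minutes = float(row["minutes_in_content"])
            attempts = int(row["quiz_attempts"])
            if logins < MIN_LOGINS or minutes < MIN_MINUTES or attempts < MIN_ATTEMPTS:
                flagged.append(row["student_id"])
    return flagged

if __name__ == "__main__":
    # Expected export columns: student_id,logins,minutes_in_content,quiz_attempts
    print(flag_low_engagement("module1_activity.csv"))
```
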
In direct connection to recent, significant institutional changes due to COVID-19, instructors should consider using LMS features and technology to support students who may be unfamiliar with virtual learning. Student virtual learning experiences vary; student expectations and orientations toward learning drive virtual course outcomes (Johnson, 2006; Yurdugul & Menzi Cetin, 2015). Therefore, it is critical to establish clear student expectations early in the course during the virtual learning process (Sieber, 2005). Additionally, students have expressed a sense of social disconnection from course instructors and classmates in online courses (Plante & Asselin, 2014). Online databases, "copy" options, and other LMS-embedded technology features can address perceived communication and social barriers in a virtual learning environment. Instructors are encouraged to develop step-by-step instructions with screenshots/images, aiding student orientation to course assessments. Students report positive reactions toward increased instructor guidance and value the use of LMS features that facilitate efficient and personalized communication, such as the calendar, chat, and notification features (Lonn & Teasley, 2009; Young, 2006).

Conclusion

The current demand for virtual assessment in technology-mediated systems is high. Institutes of higher education can improve students' virtual learning experiences through strategic assessment. Robust assessment does not rely on a single format but rather employs multiple metrics to measure student progress and knowledge acquisition. Instructors should continue to seek authentic assessments that engage students in collaborative problem-solving and provide opportunities for peer-assisted learning and peer evaluation through the development of field-based products (Kim et al., 2008).

When instructors provide timely student feedback and continual progress communication throughout a course, students are most successful (Barry, 2012; Tsai, 2013).

Innovative technology and media are continuously being developed and implemented in education. An exhaustive list of innovative assessment strategies is beyond the scope of this article. However, the nature of virtual learning environments, especially in the wake of COVID-19, is of notable significance and elicits additional ethical challenges for privacy, fairness, and equity. Instructors should check their institution's instructional technology policies when selecting class platforms outside of university-adopted LMS. Many online platforms could present a privacy or security risk, especially when relying on personal accounts. Even with clear instructions, instructors may be unable to monitor access or inappropriate interactions. Additionally, instructors face the challenge of creating virtual content and environments that are accessible for an increasing group of students who may not have otherwise opted for online instruction. Thus, instructors should carefully consider course expectations for assessment strategies based on the student population, including student demographics, access to consistent and stable internet, computer resources, fees for software programs, and students' ability to reply or respond within designated time frames. Many strategies for online assessment may also make it difficult to provide necessary accommodations for disabilities or cultural and language differences.

Finally, despite these promising strategies for improved assessment methods, there are still avenues for further exploration in virtual instruction. Prompt and supportive feedback is critical (Mann, 2014; Plante & Asselin, 2014); however, future research should provide insight into the best type of feedback and its outcomes on student performance. Also, it is not clear if students provide open, unbiased feedback regarding self and peer contributions and progress in a virtual learning environment, even when the avenue for feedback is anonymous or confidential. It would be helpful to know how feedback improves engagement and performance and how assessment modality impacts these outcomes. For example, a sparsity of data exists concerning the relationship between multiple assessment trials on multiple-choice exams and student performance outcomes, content knowledge, or acquisition of positive self-regulated learning behaviors. Greater exploration of multiple assessment trials in online environments could contribute to improved student outcomes.

References

Balaji, M., & Chakrabarti, D. (2010). Student interactions in online discussion forum: Empirical research from 'media richness theory' perspective. Journal of Interactive Online Learning, 9(1), 1–22.
Barnard-Brak, L., & Shiu, W. (2010). Classroom Community Scale in the blended learning environment: A psychometric review. International Journal of E-Learning, 9(3), 303–311. https://www.learntechlib.org/primary/p/30426/
Barry, S. (2012). A video recording and viewing protocol for student group presentations: Assisting self-assessment through a Wiki environment. Computers & Education, 59, 855–860. https://doi.org/10.1016/j.compedu.2012.04.008
Corey, D., & Ben-Porath, Y. (2020). Practical guidance on the use of the MMPI instruments in remote psychological testing. Professional Psychology: Research and Practice, 51(3), 199–204. http://dx.doi.org/10.1037/pro0000329
Everson, M., & Garfield, J. (2008). An innovative approach to teaching online statistics courses. Technology Innovations in Statistics Education, 2(1), 1–18. https://escholarship.org/uc/item/2v6124xr
Gaytan, J., & McEwen, B. (2007). Effective online instructional and assessment strategies. The American Journal of Distance Education, 21(3), 117–132. https://doi.org/10.1080/08923640701341653
Gonzalez, T., Rubia, M. A., Hincz, K. P., Comas-Lopez, Subirats, L., Fort, S., & Sacha, G. M. (2020). Influence of COVID-19 confinement on students' performance in higher education. PLOS ONE, 15(10), 1–23. https://doi.org/10.1371/journal.pone.0239490
Harmon, O., Lambrinos, J., & Buffolino, J. (2010). Assessment design and cheating risk in online instruction. Online Journal of Distance Learning Administration, 13(3). https://www.westga.edu/~distance/ojdla/Fall133/harmon_lambrinos_buffolino133.html
Johnson, G. (2006). College student psycho-educational functioning and satisfaction with online study groups. Educational Psychology, 26(5), 677–688. https://doi.org/10.1080/01443410500390848
Kim, N., Smith, M., & Maeng, K. (2008). Assessment in online distance education: A comparison of three online programs at a university. Online Journal of Distance Learning Administration, 11(1). https://www.westga.edu/~distance/ojdla/spring111/kim111.html
Lonn, S., & Teasley, S. (2009). Saving time or innovating practice: Investigating perceptions and uses of learning management systems. Computers & Education, 53(3), 686–694. https://doi.org/10.1016/j.compedu.2009.04.008
Mann, J. (2014). A pilot study of RN-BSN completion students' preferred instructor online classroom caring behaviors. The ABNF Journal, 25(2), 33–39.
Marshall, J., Roache, D., & Moody-Marshall, R. (2020). Crisis leadership: A critical examination of educational leadership in higher education in the midst of the COVID-19 pandemic. International Studies in Educational Administration, 48(3), 30–37.
McDougall, J. (2015). The quest for authenticity: A study of an online discussion forum and the needs of adult learners. Australian Journal of Adult Learning, 55(1), 94–113.
Myyry, L., & Joutsenvirta, T. (2015). Open-book, open-web online examinations: Developing examination practices to support university students' learning and self-efficacy. Active Learning in Higher Education, 16(2), 119–132. https://doi.org/10.1177/1469787415574053
Perrotta, K., & Bohan, C. H. (2020). A reflective study of online faculty teaching experiences in higher education. Journal of Effective Teaching in Higher Education, 3(1), 50–66. https://jethe.org/index.php/jethe/article/view/9
Plante, K., & Asselin, M. (2014). Best practice for creating social presence and caring behaviors online. Nursing Education Perspectives, 35(4), 219–223.
Portolese, L., Krause, J., & Bonner, J. (2016). Timed online tests: Do students perform better with more time? American Journal of Distance Education, 30(4), 264–271. https://doi.org/10.1080/08923647.2016.1234301
Roberts, R., & Davis, M. (2015). Assessment for a model for achieving competency in administration and scoring of the WAIS-IV in post-graduate psychology students. Frontiers in Psychology, 6, 1–8. https://doi.org/10.3389/fpsyg.2015.00641
Salter, N., & Conneely, M. (2015). Structured and unstructured discussion forums as tools for student engagement. Computers in Human Behavior, 46, 18–25. http://dx.doi.org/10.1016/j.chb.2014.12.037
Schultz, P., & Quinn, A. (2014). Lights, camera, action! Learning about management with student-produced video assignments. Journal of Management Education, 38(2), 234–258. https://doi.org/10.1177/1052562913488371
Seif, G., Brown, D., & Dusti, A. (2013). Video-recorded simulated patient interactions: Can they help develop clinical and communication skills in today's learning environment? Journal of Allied Health, 42(2), 37–44.
Shaw, A. (2019). Authentic assessment in the online classroom. Wiley. https://ctl.wiley.com/authentic-assessment-in-the-online-classroom/
Sieber, J. (2005). Misconceptions and realities about teaching online. Science and Engineering Ethics, 11(3), 329–340.
Stockwell, B., Stockwell, M., Cennamo, M., & Jiang, E. (2015). Blended learning improves science education. Cell, 162(5), 933–936. http://dx.doi.org/10.1016/j.cell.2015.08.009
Stowell, J., & Bennett, D. (2010). Effects of online testing on student exam performance and test anxiety. Journal of Educational Computing Research, 42(2), 161–171. https://doi.org/10.2190/EC.42.2.b
Talley, C., & Scherer, S. (2013). The enhanced flipped classroom: Increasing academic performance with student-recorded lectures and practice testing in a "flipped" STEM course. Journal of Negro Education, 82(3), 339–347. https://doi.org/10.7709/jnegroeducation.82.3.0339
Tsai, C. (2013). An effective online teaching method: The combination of collaborative learning with initiation and self-regulation learning with feedback. Behaviour & Information Technology, 32(7), 712–723. https://doi.org/10.1080/0144929X.2012.667441
Watson, G., & Sottile, J. (2010). Cheating in the digital age: Do students cheat more in online classes? Online Journal of Distance Learning Administration, 13(1).
You, J. (2016). Identifying significant indicators using LMS data to predict course achievement in online learning. Internet and Higher Education, 29, 23–30. http://dx.doi.org/10.1016/j.iheduc.2015.11.003
Young, S. (2006). Student views of effective online teaching in higher education. The American Journal of Distance Education, 20(2), 65–77. https://doi.org/10.1207/s15389286ajde2002_2
Yurdugul, H., & Menzi Cetin, N. (2015). Investigation of the relationship between learning process and learning outcomes in E-learning environments. Eurasian Journal of Educational Research, 58, 57–74. http://dx.doi.org/10.14689/ejer.2015.59.4