Impact of Pandemic COVID-19 to the Online Learning: Case of Higher Education Institution in Malaysia
Universal Journal of Educational Research 8(12A): 7607-7615, 2020
DOI: 10.13189/ujer.2020.082546
Received September 6, 2020; Revised November 17, 2020; Accepted November 29, 2020
Abstract  This study's aim is to examine and evaluate the impact of COVID-19 on education systems, namely the implementation of online learning (OL) during the Movement Control Order (MCO) at higher education institutions in Malaysia. A questionnaire was developed, and random sampling of data was performed. Three hundred and fifty-two samples were collected, and the Smart PLS Structural Equation Modeling (SEM) technique was used to analyze the results. The study shows that acceptance, usability, satisfaction, and technical skills play a pivotal role in driving online learning effectiveness for institutions of higher education. Such consideration allows higher education institutions across Malaysia to recognize the importance of online learning in education in a consistent way. All the factors sustain positive efficacy except acceptance of online learning. The present study covers only institutions of higher learning; elementary and secondary school students also participate in online learning, so these groups can be included in further studies. The research would continue to develop the online learning environment, as higher education organizations are gradually embracing online learning as an alternative to conventional classroom learning for current students and as a means to extend their reach to new students. Adopting online learning and its technologies requires significant investment in staff, resources, energy, and space that administrators and those in educational leadership need to justify. Only a few studies have explored the effect of COVID-19 on the education system and how online learning was conducted during the MCO; using Smart PLS is a novel approach, making this a first-hand study of its kind.

Keywords  COVID-19, Online Learning, Acceptance, Usability, Satisfaction, Technical Competencies, Smart PLS

1. Introduction

Schools and higher education institutions have been challenged by significant shifts in diverse contexts due to COVID-19 and the Malaysia Movement Control Order (MCO). Students today grow up with modern gadgets and the Internet, and their practices are distinct from those of the previous generation. As such, identifying these differences and designing educational experiences suitable for their learning styles, characteristics, and attitudes are challenges faced by educational practitioners and designers. Student expectations and satisfaction with online learning courses have received significant interest from educational professionals and researchers. However, an observational study of online learning performance covering all levels of students, from elementary and secondary to higher-level institutions, is still lacking. Acceptance, satisfaction, accessibility, and online learning technical competencies are the metrics used in this study for the success of online learning, and they have yet to be examined at all levels of students in previous e-learning website studies.
3. Research Model

The design of the conceptual framework for this exploratory research is built based on the literature. It has four independent, or exogenous, variables. Using Smart PLS software, the conceptual model for the proposed framework was developed. The independent variables are the main factors expected to impact online learning effectiveness in higher education institutions. The dependent variable is the effectiveness of OL, which is the key factor in generating quality students for higher education. The study population comprises only students of higher education institutions.

The study was carried out in Malaysia on a random group of 315 students. First, a qualitative analysis was performed for the higher education institution, based on interviews and experiences with the students. A survey questionnaire was then developed based on the conceptual framework and distributed to students across higher learning institutions. The sample size of any statistical analysis has a direct effect on its statistical power. In general, a researcher needs a very large sample to detect a true difference when the distribution of the dependent variable is skewed and the effect size is small. The study [16] recommended building a model with a sample size of 200 on average. According to [13], in the case of SEM, sample size requirements are influenced by four elements: misspecification of the model, the scale of the model, deviations from normality, and the estimation procedures. If the model incorporates all of the theory's relevant constructs and indicators, the influence of sample size on the model's ability to reliably detect specification error can be marginal. PLS-SEM is beneficial when used with small sample sizes, but some researchers abuse this advantage by depending on extremely small samples relative to the population. According to [9], a sample size of 84 suffices for four predictors, so the total sample size used matched the study's requirements. The data obtained from the questionnaire were entered into MS Excel and stored as a .csv file, which can be read by various statistical tools to perform reliability testing.

The factors defined from the description in the conceptual model (Figure 1) were operationalized as four exogenous variables and one endogenous variable, and these were used to collect the data. The four independent (exogenous) variables and their indicators were defined for this research study as acceptance, usability, satisfaction, and technical competencies in online learning. The one dependent variable is online learning efficacy, since the conceptual model is used for exploratory research. All indicators were measured on a multi-item, five-point, bipolar Likert scale ranging from "strongly disagree" (1) to "strongly agree" (5). For each independent variable, the item ratings were summed to form a summary rating scale. As this is the first study of its kind in this exploratory setting, all of the items were written explicitly for this research.

For model assessment, according to Hair et al., the formal evaluation follows a two-step process with separate evaluation of the measurement model and the structural model. First, construct validity should be tested as part of the assessment of the reflective measurement model. Construct validity consists of two important components: convergent validity and discriminant validity. Convergent validity assesses factor loadings, composite reliability (CR), and Average Variance Extracted (AVE), while discriminant validity evaluates cross-loadings, the Fornell-Larcker criterion, and HTMT.

Assessment of Convergent Validity. Convergent validity refers to "the extent to which a measure (indicators) positively correlates with alternative measures (indicators) of the same construct". Table 1 shows the specifics of each construct used in the conceptual framework; the indicators associated with each construct are listed in the column named "Item". First, composite reliability is verified, and any value below the threshold of 0.7 needs to be assessed. This is established by checking the indicator reliability values: any indicator loading below 0.4 should be removed from its construct, and any loading between 0.4 and 0.6 can be retained if it increases the AVE and CR values. After removing such indicators, the remaining constructs and their corresponding convergent validity scores met the threshold values of CR > 0.7 and AVE > 0.5, respectively.

4. Result and Discussion

4.1. Validity and Reliability Tests

Validity and reliability assessments were administered to determine the accuracy and consistency of the questionnaire. A validity check was carried out by testing the Pearson correlation coefficient at a significance level (α) of 5 percent, using a one-sided test. The evaluation criteria were: 1) the instrument was declared valid if r count > r table; and 2) the instrument was declared invalid if r count < r table.
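The r-count-versus-r-table criterion can be sketched in code. This is an illustrative reconstruction, not the authors' actual procedure: the toy data, sample size, and the derivation of the one-sided critical r from the t distribution are assumptions.

```python
import numpy as np
from scipy import stats

def item_validity(items: np.ndarray, alpha: float = 0.05):
    """For each questionnaire item (column), correlate it with the total
    score ("r count") and compare against the one-sided critical value
    ("r table") at significance level alpha."""
    n, k = items.shape
    total = items.sum(axis=1)
    # One-sided critical r derived from the t distribution with n-2 df.
    t_crit = stats.t.ppf(1 - alpha, df=n - 2)
    r_table = t_crit / np.sqrt(n - 2 + t_crit**2)
    results = []
    for j in range(k):
        r_count, _ = stats.pearsonr(items[:, j], total)
        results.append((float(r_count), bool(r_count > r_table)))  # True => valid
    return results

# Toy data: 20 respondents, 4 items on a 1-5 Likert scale (hypothetical).
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(20, 1))
data = np.clip(base + rng.integers(-1, 2, size=(20, 4)), 1, 5)
for r, valid in item_validity(data):
    print(f"r count = {r:.3f}, valid = {valid}")
```

An item whose r count falls below r table would be dropped before reliability testing, mirroring criterion 2) above.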
Because all items in the validity test had an r count greater than the r table value at the 5 percent significance level, the instrument used was declared valid. The reliability analysis used Cronbach's alpha, which measures the precision of the measuring instrument; the research instrument qualified as reliable, since the Cronbach's alpha value was 0.891. The validity and reliability assessments carried out thus confirm the research instrument's validity and reliability.

4.2. Analysis with Smart PLS

Using Smart PLS 3.0, a model of the influential factors was built, and the validity and reliability of the model were then evaluated. Convergent and discriminant validity were tested for validity, while Cronbach's alpha and composite reliability were checked for reliability. The rules of thumb for convergent validity are: 1) factor loading > 0.7, with loadings of 0.6-0.7 still acceptable for exploratory work; 2) factor loading > 0.6; and 3) Average Variance Extracted (AVE) > 0.5. Figure 2 shows the Smart PLS calculation model.
Figure 2. Measurement Model of Design Change (Source: Results Analysis with SmartPLS)
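The CR and AVE figures reported against these thresholds follow standard formulas, which can be sketched as below. The loadings here are hypothetical placeholders, not values from the paper's Table 2.

```python
def composite_reliability(loadings):
    """Composite reliability for one reflective construct:
    CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each error variance is 1 - loading^2 for standardized loadings."""
    s = sum(loadings)
    error = sum(1 - l**2 for l in loadings)
    return s**2 / (s**2 + error)

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    return sum(l**2 for l in loadings) / len(loadings)

# Hypothetical standardized loadings for a single construct.
loadings = [0.82, 0.77, 0.74, 0.69]
cr = composite_reliability(loadings)
ave = average_variance_extracted(loadings)
print(f"CR = {cr:.3f} (threshold > 0.7), AVE = {ave:.3f} (threshold > 0.5)")
```

With these example loadings both criteria pass, which is the pattern the text reports for the retained constructs.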
The factor loading values, Cronbach's alpha values, composite reliability values, and AVE values are shown in Table 2. George and Mallery (2003) stated that a Cronbach's alpha above 0.7 is acceptable and above 0.9 is excellent; in the current study, the values are above 0.9. Within the current data set, AVE is equal to or greater than 0.5, and composite reliability is more than sufficient. Discriminant validity is likewise shown in Table 2. Using Smart PLS 3, the structural model was evaluated after the evaluation of the measurement model. Direct and indirect effects were analyzed to attain this objective. The hypotheses were verified by considering the path coefficients and the t-values, and R-squared (R²) and predictive relevance (Q²) were also investigated.
4.3. Structural Model

We evaluated multivariate skewness and kurtosis, as indicated by [14]. The findings revealed that the data obtained were not multivariate normal, with Mardia's multivariate skewness (β = 5.115, p < 0.01) and Mardia's multivariate kurtosis (β = 62.566, p < 0.01). We therefore report the path coefficients, standard deviations, t-values, and p-values for the structural model using a 5,000-resample bootstrapping method, following the recommendations of [14]. We also followed the suggestion of [15] that p-values alone are not a good criterion for evaluating the validity of an experiment, and that parameters such as p-values, confidence intervals, and effect sizes should be reported in combination. The parameters we used to assess the established hypotheses are shown in Table 4.

We first tested the influence of the four predictors on OL efficacy. R² was 0.633, which indicates that 63.3 percent of the variation in OL efficacy was explained by the four predictors: Acceptance (β = -0.054, p > 0.10), Satisfaction (β = 0.255, p < 0.01), technical competencies (β = 0.261, p < 0.10), and usability (β = 0.195, p > 0.10). H2, H3, and H4 were positively linked to OL efficacy and were supported, but H1 was negatively related to the effectiveness of OL and not significant; thus H1 was rejected.
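The bootstrapping procedure behind these path estimates can be sketched as follows. This is a simplified stand-in under stated assumptions: it bootstraps OLS coefficients rather than running the actual PLS algorithm, the data are synthetic, and only 1,000 resamples are drawn instead of the paper's 5,000.

```python
import numpy as np

def bootstrap_paths(X: np.ndarray, y: np.ndarray, n_boot: int = 1000, seed: int = 0):
    """Re-sample rows with replacement, re-fit the regression each time,
    and collect the coefficients; their standard deviation is the
    bootstrap standard error used to form t-values (beta / SE)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    Xc = np.column_stack([np.ones(n), X])  # add intercept column
    betas = np.empty((n_boot, Xc.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample rows with replacement
        betas[b], *_ = np.linalg.lstsq(Xc[idx], y[idx], rcond=None)
    return betas.mean(axis=0), betas.std(axis=0, ddof=1)

# Synthetic data: 4 predictors standing in for acceptance, satisfaction,
# technical competencies, and usability; y stands in for OL efficacy.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))
y = X @ np.array([-0.05, 0.26, 0.26, 0.20]) + rng.normal(scale=0.8, size=200)
mean_beta, sd_beta = bootstrap_paths(X, y)
print("path coefficients:", np.round(mean_beta[1:], 3))
print("bootstrap SEs:   ", np.round(sd_beta[1:], 3))
```

Dividing each coefficient by its bootstrap standard error gives the t-values used to judge the hypotheses, which is why bootstrapping is preferred when the data are not multivariate normal.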
5. Conclusions

This study aims to investigate the effect of the COVID-19 pandemic on the education system of higher learning institutions by analyzing the hypothesized model, i.e., "Acceptance of Students - Effectiveness of Online Learning", in higher education institutions. Only two constructs are accepted from the four hypotheses tested, and the t-values obtained show that they are statistically significant. This research also discusses the validity and reliability of the measures used, and the results indicate strong convergent and discriminant validity. Satisfaction, usability, and technical competencies are positively related to the effectiveness of online learning for higher education institution students. This is supported by [3], who found that students were satisfied with the e-learning mode of teaching. Other studies, conducted by [27] and [1], have shown that the quality of knowledge is the most significant dimension, followed by the navigation of the e-learning system. According to [17] and [32], students' interest in learning was increased by an e-learning program or website with a harmonious color and context configuration. The factor of student acceptance negatively affects the effectiveness of online learning during the MCO: students used fully online learning for the first time, so most of them could not accept this new norm. Students may also face problems during online learning, such as time management and the study environment, which have a major impact on their acceptance of online learning. In conclusion, the results show that the hypotheses are backed by empirical research and are in line with previous observations and the theoretical context.

REFERENCES

[1] Alsherri, A.A., Rutter, and Smith. "An Implementation of the UTAUT Model for Understanding Students' Perceptions of Learning Management Systems: A Study Within Tertiary Institutions in Saudi Arabia", International Journal of Distance Education Technologies, Volume 17, Issue 3, pp. 123-128, 2015.

[2] Arbaugh, J. B., & Benbunan-Fich, R. "The importance of participant interaction in online environments", Decision Support Systems, 43(3), 853-865, 2007.

[3] Ayub, N. and Iqbal, S. "Student Satisfaction with e-Learning achieved in Pakistan", Asian Journal of Distance Education, The Asian Society of Open and Distance Education, ISSN 1347-9008, Vol. 9, No. 2, pp. 26-31, 2011.

[4] Bataineh, E. A summary look at Internet based distance education, 2001.

[5] Bates, A.W., & Bates, T. "Technology, E-learning and Distance Education", Psychology Press, 2005.

[6] Benbunan-Fich, R., Hiltz, S.R., and Harasim, L. "The Online Interaction Learning Model: An Integrated Theoretical Framework for Learning Networks", 2005.

[7] Büyüközkan, G., Ruan, D., & Feyzioğlu, O. "Evaluating e-learning Website quality in a fuzzy environment", International Journal of Intelligent Systems, 22, 567-586, 2007.

[8] Chiu, C., Hsu, M., & Sun, S. "Usability, quality, value and e-learning continuance decisions", Computers & Education, 45, 399-416, 2005.

[9] Green, S.B. "How many subjects does it take to do a regression analysis", Multivariate Behavioral Research, Vol. 26, No. 3, pp. 499-510, 1991.

[10] Govindasamy, T. "Successful implementation of e-Learning: Pedagogical considerations", The Internet and Higher Education, 4(3-4), 287-299, 2002.

[11] Harasim, L. "Shift Happens: Online Education as a New Paradigm in Learning", Internet and Higher Education, Vol. 3, pp. 41-61, 2000.

[12] Hart, C. Factors Associated with Student Persistence in an Online Program of Study: A Review of the Literature. Journal of Interactive Online Learning, Volume 11, Number 1, pp. 19-42.

[13] Hair, J., Anderson, R., Tatham, R.L., & Black, W.C. Multivariate Data Analysis (5th ed.), Upper Saddle River, NJ: Prentice-Hall, 1998.

[14] Hair, J. F., Hollingsworth, C. L., Randolph, A. B., & Chong, A. Y. L. An updated and expanded assessment of PLS-SEM in information systems research. Industrial Management & Data Systems, 117(3), 442-458, 2017.

[15] Hahn, E.D. and Ang, S.H. "From the Editors: New Directions in the Reporting of Statistical Results in the Journal of World Business", Journal of World Business, 52(2), pp. 125-126, 2017.

[16] Hoelter, J. W. The Analysis of Covariance Structures: Goodness-of-Fit Indices. Sociological Methods & Research, 11, 325-344, 1983.

[17] Hong, K.S., Lai, K.W., & Holton, D. Students' satisfaction and perceived learning with a Web-based course. Journal of Educational Technology & Society, 6(1), 2003. Retrieved January 15, 2004, from http://ifets.ieee.org/periodical/vol_1_2003/v_1_2003.html

[18] Hiltz, S. R. "The virtual classroom: Learning without limits via computer networks", Norwood, NJ: Ablex, 1994.

[19] Ibrahim, R., Leng, N. S., Yusoff, R. C. M., Samy, G. N., Masrom, S., Rizman, Z. I. "E-learning acceptance based on technology acceptance model (TAM)", J. Fundam. Appl. Sci., 9(4S), 871-889, 2017.

[20] Jurczyk, J., Benson, S. K., & Savery, J. "Measuring student perceptions in Web-based courses: A standards-based approach", Online Journal of Distance Learning Administration. Retrieved November 17, 2009, from http://www.westga.edu/~distance/ojdla/winter74/jurczyk74.htm

[21] Johnson, S. D., Aragon, S. R., Shaik, N., and Palma-Rivas, N. Comparative analysis of learner satisfaction and learning outcomes in online and face-to-face learning environments. Journal of Interactive Learning Research, 11(1), 29-49, 2000.

[22] Kiget, N.K., Wanyembi, G. and Peters, A.I. "Evaluating Usability of E-Learning Systems in Universities", (IJACSA) International Journal of Advanced Computer Science and Applications, Vol. 5, No. 8, pp. 97-118, 2014.

[23] Khan, B.H. "A Framework for Web-Based Learning". In B.H. Khan (Ed.), Web-Based Training. Englewood Cliffs, NJ: Educational Technology Publications, 2001.

[24] Klobas, J. E., & Clyde, L. A. "Learning to use the internet in a developing country: Validation of a user model", Libri, 48(3), 163-175, 2010.

[25] Klobas, J. E., & McGill, T. J. "The Role of Involvement in Learning Management System Success", Journal of Computing in Higher Education, 22, 112-134, 2010.

[26] Kort, W., & Gharbi, J. (n.d.). "An experiential approach of satisfaction in e-learning". Retrieved July 5, 2008, from http://www.canavents.com/its2008/abstracts/283.pdf

[27] Lee, B., Yoon, J., & Lee, I. "Learners' acceptance of e-learning in South Korea: Theories and results", Computers & Education, 53, 1320-1329, 2009.

[28] Mohamed Hussain Thowfeek, M.H., and Abdul Salam, M.N. "Students' Assessment on the Usability of E-learning Websites", Procedia - Social and Behavioral Sciences, No. 141, pp. 916-922, 2014.

[29] Moore, M.G. "Three Types of Interaction", American Journal of Distance Education, Vol. 3, No. 2, pp. 1-7, 1989.

[30] Neuhauser, C. "Learning Style and Effectiveness of Online and Face-to-Face Instruction", American Journal of Distance Education, Vol. 2, No. 2, pp. 99-113, 2002.

[31] Quyen Le Hoang Thuy To Nguyen, Phong Thanh Nguyen, Vy Dang Bich Huynh, Luong Tan Nguyen. Application Chang's Extent Analysis Method for Ranking Barriers in the E-Learning Model Based on Multi-Stakeholder Decision Making. Universal Journal of Educational Research, 8(5), 1759-1766, 2020. DOI: 10.13189/ujer.2020.080512.

[32] Ramayah, T., Hwa, C.J. & Chuah, F. "Partial Least Squares Structural Equation Modeling (PLS-SEM) using SmartPLS 3.0: An Updated and Practical Guide to Statistical Analysis", Pearson Singapore, 2018.

[33] Rafaeli, S. and Sudweeks, F. "Networked Interactivity", Journal of Computer-Mediated Communication, 2(4), 1997.

[34] Rosenberg, M.J. "E-Learning: Strategies for Delivering Knowledge in the Digital Age", McGraw-Hill, New York, 2001.

[35] Russell, L.B. Modelling for Cost-Effectiveness Analysis. Statistics in Medicine.

[36] Smith, G. G., Ferguson, D. & Caris, M. "Online vs face-to-face", T.H.E. Journal, 28(9), pp. 18-25, 2001.

[37] Sigala, M. Investigating the factors determining e-learning effectiveness in tourism and hospitality education. Journal of Hospitality & Tourism Education, Vol. 16, No. 2, pp. 11-21, 2004.

[38] Strong, R. "Investigating Students' Satisfaction with eLearning Courses: The Effect of Learning Environment and Social Presence", Journal of Agricultural Education, Volume 53, Number 3, pp. 98-110, 2012.

[39] Taylor, J. C. Fifth generation distance education. e-Journal of Instructional Science and Technology (e-JIST), 4(1), 1-14, 2001.

[40] Umi Kalsom Masrom, Nik Aloesnita Nik Mohd Alwi, Nor Hazlin Nor Asshidin. Understanding Learners' Satisfaction in Blended Learning among Undergraduate Students in Malaysia. Universal Journal of Educational Research, 7(10), 2233-2238, 2019. DOI: 10.13189/ujer.2019.071023.

[41] Zawaideh, F.H. "Acceptance Model for e-Learning Services: A Case Study at Al-Madinah International University in Malaysia", International Journal of Academic Research in Accounting, Finance and Management Sciences, Vol. 7, No. 2, April 2017, pp. 14-20.