KSA Proctoring
A R T I C L E I N F O

Keywords: E-Exam; Questionnaire preparation; Exam implementation; ANOVA analysis; SEM analysis

A B S T R A C T

Assessment is an important aspect of learning: it is done to find out what each student has learned. The adoption of valid assessment techniques helps to improve the quality of the educational process and allows students to determine whether or not they achieved their learning goals. Because of this importance, students are assessed in a variety of ways, and E-Exams are one such technique. Students in university courses must be examined in a short period through mid-examinations and final exams, which necessitates a quick evaluation procedure that yields rapid results. This research work analyzes student perceptions of the E-Exam system in higher education in Saudi Arabian universities. The study of the E-Exam system is primarily focused on the following subjects: (a) E-Exam implementation; (b) E-Exam environment factors; (c) student experience based on E-Exam attempts; (d) E-Exam system challenges; (e) recommendations for E-Exam requirements. The suggested work is divided into three stages. The first step is the preparation of a questionnaire that explicitly addresses the impact of the aforementioned motivations on learners; the data presented in the questionnaire were closely associated with KSA university students. In the second phase, the developed questionnaire was distributed to students as well as research scholars at KSA universities, with all questions mandatory. In the third step, the responses of the concerned individuals from Saudi Arabian universities are analyzed; the evaluation is made with ANOVA and SEM analysis.
* Corresponding author.
E-mail address: g.govindasamy@seu.edu.sa (G. Gokulkumari).
https://doi.org/10.1016/j.advengsoft.2022.103195
Received 21 April 2022; Received in revised form 8 July 2022; Accepted 23 July 2022
Available online 17 August 2022
0965-9978/© 2022 Elsevier Ltd. All rights reserved.
G. Gokulkumari et al. Advances in Engineering Software 173 (2022) 103195
that online tests were regarded to offer substantial advantages over conventional paper-based exams, such as grading consistency and savings in the duration, energy, and expenses incurred in the exam process. Participants, on the other hand, highlighted several difficulties with the effective deployment of online tests, including concerns of confidentiality, authenticity, and fairness. The data further suggested that E-Exams were better suited to summative assessment, which measures learning, than to formative evaluation. Online examinations must always be designed to be legitimate, dependable, trustworthy, and adaptable to be implemented successfully.

In 2018, Fluck et al. [3] used a case study analysis comparing an E-Examination system at "FUT Minna, Nigeria with one at the University of Tasmania in Australia". Considering various approaches to question kinds, cohort size, technology employed, and security features, the responsibilities supported or impeded by each of the two E-Examination systems were contrasted. The researchers' goal was to help stakeholders (such as lecturers, invigilators, applicants, computer instructors, and server operators) figure out how to make the process better. The comparison takes into account the relative convenience of students, managers, and lecturers and the dependability and safety of both systems. Challenges in E-Exams were revealed by contrasting the systems in each country, and the authors suggest how more successful E-Examination methods might be developed.

In 2021, Chirumamilla and Sindre [4] examined the essential characteristics and processes of the different stakeholders in E-Exam platforms. An exploratory case study was carried out drawing on discussions with 12 members of three distinct groups: salespeople at Norway's universities, process managers, and systems managers. These groups agree greatly on the fundamental characteristics of E-Exam systems, but emphasize that not all capabilities sought by end-users were given priority. There was also much agreement on allowing smoother integration with the information systems of the higher education sector and easier add-ons for specific purposes, with standardization, open interfaces, and digital ecosystems - but the ambitions of a flexible ecosystem have yet to be reached.

In 2021, Ahmed et al. [5] set out to present the E-Examination and E-evaluation experiences of education organizations in different countries. The report suggests that during the worldwide COVID-19 pandemic, pupils were evaluated using a comprehensive, ongoing assessment, including an authentication-supported E-Exam, which assisted in identifying and avoiding student infringements. The data reveal that among many LMS systems, such as Blackboard and eFront, Moodle and proprietary solutions were used the most. Due to their zero cost, the least developed nations chose to employ open-source rather than proprietary solutions.

In 2020, Elsalem et al. [6] conducted cross-sectional research in which students in Jordan assessed their experience with remote E-Examination during the COVID-19 pandemic. Students of the faculties of medical science ("Medicine, Dentistry, Pharmacy, Nursing, and Applied Medical Sciences") of "Jordan University of Science and Technology" developed and disseminated a Google Forms survey of around 29 questions. The questions cover student demographics, stress experiences and stress causes, and behavioral changes in connection with remote E-Exams. Descriptive statistics, cross-tables, and the Chi-square test were used for the analysis of the responses.

3. Analysis of learner's perspective towards E-Exam

3.1. Proposed hypothesis

To find solutions to the research problems and to get to their core, it is important to formulate statistical hypotheses that can be tested and that answer the students' questions about the E-Exam. Here four hypotheses are evaluated against the different responses. The hypotheses formulated for the analysis of the learner's perspective towards E-Exam are manifested below:

H0: There is no significant difference between the perceptions of students' interest in E-Exam.
HA: Factors like security, privacy, and confidentiality have a significant effect on E-Exam.
H0: There is no significant relationship between the number of attempts and experience and student satisfaction.
HA: The challenges of E-Exam have a significant effect on the E-Exam system.

Compared to paper-based examinations, E-Exams are effective and reduce the workload of faculty members. The above four hypotheses are tested and each gives efficient results.

3.2. Methodology

The need for this study was to evaluate the students' view of E-Exams. E-Exams are a good way of assessing a student's potential in knowledge compared with traditional exams: an E-Exam is an electronic exam in which a computer system is used to evaluate the capacity and potential of students. This empirical study attempts to analyze student perception of the E-Exam system for higher education in KSA universities and its impact on service quality. The proposed methodology was carried out in three phases.

• Initially, a questionnaire was prepared with 55 questions under 5 categories: E-Exam implementation; factors impacting the E-Exam environment; student's experience based on the attempt of E-Exam; E-Exam system challenges; suggestions for the requirements for E-Exam.
• The prepared questionnaire was distributed among the students as well as research scholars of KSA universities. Since all the questions were prepared with due care, the students and research scholars were asked to fill in all the questions with a high level of accuracy.
• In the third phase, the responses collected from the students and research scholars of KSA universities were taken for analysis.

3.3. Organized questionnaire

The questionnaire aims to collect exact information related to the research. Its main objective is to find the acceptance of students for the E-Exam over paper-based exams. In this paper, a questionnaire is prepared for determining the student's perception of the E-Exam system; questionnaire preparation is the crucial as well as the fundamental part of this empirical study. To make this study more valuable and meaningful, the questionnaire has been prepared with 55 questions framed under 5 different sections (drivers): E-Exam implementation; factors impacting the E-Exam environment; student's experience based on the attempt of E-Exam; E-Exam system challenges; suggestions for the requirements for E-Exam. Further, 10 questions have been framed under E-Exam implementation; 12 under factors impacting the E-Exam environment; 13 under student's experience based on the attempt of E-Exam; 13 under E-Exam system challenges; and 7 under suggestions for the requirements for E-Exam. This designed questionnaire has been circulated among the students and research scholars who are pursuing higher education in KSA.

3.4. Data collection

In terms of data collection, 101 samples were collected from diverse respondents (students and research scholars) who are pursuing their higher education in KSA universities (both government as well as private
Twelve questions have been prepared under this section, and all these questions are furnished in Table 2. The responses of the 101 participants to these questions are given in the form of a pie chart in Fig. 2. For the 1st question (What is the most important criterion for a ranking-based E-Exam, in your opinion?), 48% of the participants selected the user-friendliness option, 10% device compatibility, 19% question distribution as per the allotted time, 7% the existence of a relevant answering tool (mathematical expressions), and 17% online technical support. Then, 72% of them said that, beyond the regular method of learning, the E-Exam inspires a new way of learning. For the 3rd question (Do you think that using an E-Exam service may make the learning process easier?), 81%, 8%, and 11% of them selected yes, no, and maybe, respectively. For the 5th question (Must any platform be compatible with an E-Exam?), 69%, 6%, and 28% of them selected yes, no, and maybe, respectively. While performing the E-Exam, system issues were identified as the major obstacle faced by the students. Moreover, a majority of 30% of the participants neither agreed nor disagreed that online examinations are impractical due to technical issues. In addition, for the 8th question (It is extremely uncommon for technological difficulties to arise during the E-Exam), 26%, 8%, 38%, 23%, and 6% of them selected agree, strongly agree, neutral, disagree, and strongly disagree, respectively. 60%, 8%, and 32% of the respondents said that they can, cannot, or may be able to raise a technical-error issue ticket easily online, respectively. 21%, 4%, 65%, 7%, and 3% of the respondents selected agree, strongly agree, neutral, disagree, and strongly disagree, respectively, for the statement that technical problems were resolved right away. For the question "Is the resolved solution obtained for the E-Exam technical error raised satisfactory?", a majority of 49% neither agreed nor disagreed. Moreover, 61%, 10%, 24%, and 5% of them selected agree, strongly agree, neutral, and disagree, respectively, for the question "Does a lack of familiarity with technology tools inhibit using the E-Exam?"
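The percentages above are plain frequency tabulations over the 101 responses. A minimal sketch of the computation (pandas), using a hypothetical response vector for the 8th question whose counts approximately reproduce the reported percentages; the real per-respondent data are not reproduced here:

```python
import pandas as pd

# Placeholder responses to the 8th question: counts of 26/8/38/23/6
# out of 101 respondents, chosen to echo the percentages in the text.
responses = pd.Series(
    ["agree"] * 26 + ["strongly agree"] * 8 + ["neutral"] * 38
    + ["disagree"] * 23 + ["strongly disagree"] * 6
)

# Share of respondents per option, expressed as percentages.
percentages = (responses.value_counts(normalize=True) * 100).round(1)
print(percentages)
```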
Table 2
Framed questions under factors impacting E-Exam environment.
Table 3
Framed questions under student's experience based on the attempt of E-Exam.
Table 4
Framed questions under E-Exam system challenges.
1. Do you think the E-Exam service will make the evaluation process easier?
2. Do you believe that in an E-Exam, the student's complete idea can be represented?
3. The volume of electronic exam questions is sufficient for the time allotted.
4. Do you find it hard to concentrate on the questions when doing an online exam?
5. Which of the following do you think might be made better?
6. What did you like most about E-Exam? Why?
7. What did you dislike about E-Exam the most? Why?

5.2. ANOVA analysis under factors impacting E-Exam environment

Tables 6 and 7 present the one-way and two-way ANOVA analyses under the factors impacting the E-Exam environment. Here, the column sum of squares is 729.79 and the residual error is 2004.8. The one-way ANOVA was noteworthy, with F(1, 20) = 38.2215 and P = 8.6410e-126. The two-way ANOVA is also significant, since F(1, 20) = 36.7501 and P = 5.6488e-117 for columns, F(1, 19) = 0.7819 and P = 0.7311 for rows, and F(1, 380) = 0.9224 and P = 0.8362 for the interaction.
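Each F value in these ANOVA tables follows mechanically from the sums of squares and degrees of freedom. As a check, Table 7's column effect can be re-derived with SciPy (the table's figures are taken as given):

```python
from scipy.stats import f

# Table 7, two-way ANOVA: SS_columns = 716.1 on 20 df,
# SS_error = 1636.8 on 1680 df.
ms_columns = 716.1 / 20      # mean square for the column factor
ms_error = 1636.8 / 1680     # mean square for error

# F ratio and its upper-tail probability under the F(20, 1680) law.
f_stat = ms_columns / ms_error
p_value = f.sf(f_stat, 20, 1680)

print(f"F = {f_stat:.4f}, p = {p_value:.4e}")
```

The recomputed ratio agrees with the tabulated 36.7501; note that an F statistic is conventionally reported with its factor and error degrees of freedom, i.e. F(20, 1680) here.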
Table 7
Two-way ANOVA analysis under factors impacting E-Exam environment.
            SS      df    MS       F        Probability
Columns     716.1   20    35.8051  36.7501  5.6488e-117
Rows        14.474  19    0.7618   0.7819   0.7311
Interaction 341.52  380   0.8987   0.9224   0.8362
Error       1636.8  1680  0.9743   –        –
Total       2708.9  2099  –        –        –

Table 8
One-way ANOVA analysis under student's experience based on the attempt of E-Exam.
            SS      df    MS       F        Probability
Columns     914.37  22    41.5620  48.7226  1.0728e-172
Error       1962    2300  0.8530   –        –
Total       2876.3  2322  –        –        –

Table 9
Two-way ANOVA analysis under student's experience based on the attempt of E-Exam.
            SS      df    MS       F        Probability
Columns     896.48  22    40.7490  47.3467  2.8762e-161
Rows        20.045  19    1.0550   1.2258   0.2265
Interaction 337.03  418   0.8063   0.9369   0.7965
Error       1583.6  1840  0.8607   –        –
Total       2837.2  2299  –        –        –

Table 10
One-way ANOVA analysis under student's E-Exam system challenges.
            SS      df    MS       F        Probability
Columns     986.95  21    46.9978  52.8520  1.6896e-177
Error       1956.3  2200  0.8892   –        –
Total       2943.3  2221  –        –        –

Table 11
Two-way ANOVA analysis under student's E-Exam system challenges.
            SS      df    MS       F        Probability
Columns     980.28  21    46.6802  55.1390  8.3829e-176
Rows        33.684  19    1.7728   2.0941   0.0038
Interaction 413.12  399   1.0354   1.2230   0.0042
Error       1490    1760  0.8466   –        –
Total       2917.1  2199  –        –        –

Table 12
One-way ANOVA analysis under suggestions for the requirements for E-Exam.

Table 13
Two-way ANOVA analysis under suggestions for the requirements for E-Exam.
            SS      df    MS       F        Probability
Columns     533.98  14    38.1414  44.6970  4.8721e-99
Rows        10.183  19    0.5360   0.6281   0.8874
Interaction 241.09  266   0.9063   1.0621   0.2569
Error       1024    1200  0.8533   –        –
Total       1809.3  1499  –        –        –

Absolute indices analysis, Standardized Root Mean Square Residual (SRMR): Generally speaking, the SRMR is "an absolute fit measure, defined as the standardized difference between the observed correlation and the predicted correlation." The SRMR is a favorable index: an SRMR of exactly zero indicates a perfect fit. In addition, the SRMR does not penalize model complexity. The SRMR outcomes are shown in Table 14. The threshold value for the SRMR is set at 0.08, and values below the threshold are considered a good fit. However, all the SRMR values recorded as findings of this research are above the threshold, indicating a poor fit.

Table 14
Analysis of Standardized Root Mean Square Residual (SRMR) (threshold SRMR < 0.080).

Measurement model, Tucker and Lewis (TLI) and Bentler and Bonett (NFI) relative indices analysis: Table 15 shows the results acquired for the TLI and NFI. A model with a χ2 of zero is designated as the best model. The TLI is measured according to the average data size: if the average correlation between variables is not strong, then the TLI is likewise low.

Measurement model, construct reliability: This is a quality criterion; it must be sustained to demonstrate a high correlation between indicators. The two main metrics for construct reliability are "composite reliability and Cronbach's alpha." The internal reliability of a construct is established only if the values of "Cronbach's Alpha, Dillon-Goldstein Rho, and Dijkstra-Henseler Rho" are 0.7 or higher. In this study, all collected values were determined to be higher than the threshold, so construct reliability is high. Cronbach's Alpha, Dillon-Goldstein Rho, and Dijkstra-Henseler Rho are shown in Table 16.

Measurement model, AVE: This is defined as "the average variance shared between a construct and its measurements." The AVE threshold is set at 0.5; however, the AVE values achieved are lower than this criterion. The average variance extracted is given in Table 17.

Structural model, path design matrix: The path design matrix over E-Exam implementation, factors impacting the E-Exam environment, student's experience based on the attempt of E-Exam, E-Exam system challenges, and suggestions for the requirements for E-Exam is tabulated in Table 18.

Path graph, R-Squared: Table 19 exemplifies the R-squared values corresponding to E-Exam implementation, factors impacting the E-Exam environment, student's experience based on the attempt of E-Exam, E-Exam system challenges, and suggestions for the requirements for E-Exam.
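The construct-reliability and AVE criteria above are simple functions of the item scores and standardized loadings. A minimal sketch with hypothetical data (not the study's responses or loadings); the 0.7 and 0.5 thresholds are the ones cited in the text:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE: mean of the squared standardized factor loadings."""
    return float(np.mean(loadings ** 2))

# Hypothetical data: 4 Likert items (1-5) answered by 6 respondents.
scores = np.array([
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 1],
    [4, 5, 4, 4],
])

alpha = cronbach_alpha(scores)
ave = average_variance_extracted(np.array([0.7, 0.8, 0.75, 0.72]))

# A construct passes when alpha > 0.7 and AVE > 0.5.
print(f"alpha = {alpha:.3f}, AVE = {ave:.3f}")
```

Dillon-Goldstein and Dijkstra-Henseler rho are loading-weighted analogues of the same reliability idea, evaluated against the same 0.7 threshold.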
Table 16
Analysis of construct reliability (Cronbach's Alpha > 0.7000; Dillon-Goldstein rho > 0.7000; Dijkstra-Henseler rho > 0.7000).
This research undertook an empirical investigation of students' attitudes regarding the E-Exam system in higher education in Saudi Arabian universities, and of its influence on service quality. The proposed approach was implemented in three stages. The developed questionnaire was given to students in KSA universities in the second phase. All of the questions in the questionnaire were made mandatory; as a result, the students were expected to fill in information as exact as feasible. The responses from concerned students affiliated with KSA universities were analyzed in the third phase. The evaluation was carried out with SEM and ANOVA. The viability of the E-Exams paradigm in a supervised exam room scenario was successfully shown through this seed project.

Table 19
Analysis of path graph.
Construct                                            R2      R2adj
E-Exam implementation                                0.0000  0.0000
Factors impacting E-Exam environment                 0.3837  0.3775
Student's experience based on the Attempt of E-Exam  0.2525  0.2449
E-Exam system challenges                             0.2891  0.2819
Suggestions for the requirements for E-Exam          0.1202  0.1113
Table 18
Analysis of Path Design Matrix (columns follow the same construct order as the rows).
E-Exam implementation                                0  1  1  1  1
Factors impacting E-Exam environment                 0  0  0  0  0
Student's experience based on the Attempt of E-Exam  0  0  0  0  0
E-Exam system challenges                             0  0  0  0  0
Suggestions for the requirements for E-Exam          0  0  0  0  0
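Each 1 in Table 18 marks a directed path, i.e. a regression of the column construct on E-Exam implementation, whose fit shows up as R² in Table 19. A minimal sketch with hypothetical standardized construct scores (the 0.62 weight below is illustrative, loosely echoing the correlation in Table 23):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical standardized construct scores for 101 respondents;
# the real scores come from the survey and are not reproduced here.
implementation = rng.normal(size=101)
environment = 0.62 * implementation + rng.normal(scale=0.8, size=101)

# The "E-Exam implementation -> factors impacting the environment"
# path (a 1 in Table 18) corresponds to this simple regression.
slope, intercept = np.polyfit(implementation, environment, 1)
predicted = slope * implementation + intercept

# R^2, the quantity tabulated per endogenous construct in Table 19.
ss_res = np.sum((environment - predicted) ** 2)
ss_tot = np.sum((environment - environment.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"R^2 = {r_squared:.3f}")
```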
Table 20
Analysis of path coefficients.
Table 21
Analysis of indirect effects (columns follow the same construct order as the rows).
E-Exam implementation                                0  0  0  0  0
Factors impacting E-Exam environment                 0  0  0  0  0
Student's experience based on the Attempt of E-Exam  0  0  0  0  0
E-Exam system challenges                             0  0  0  0  0
Suggestions for the requirements for E-Exam          0  0  0  0  0
Table 22
Analysis of total effects.
Table 23
Analysis of inter-construct correlation (columns follow the same construct order as the rows).
E-Exam implementation                                1       0       0       0       0
Factors impacting E-Exam environment                 0.6194  1       0       0       0
Student's experience based on the Attempt of E-Exam  0.5025  0.6324  1       0       0
E-Exam system challenges                             0.5377  0.6426  0.7321  1       0
Suggestions for the requirements for E-Exam          0.3468  0.5608  0.6294  0.7228  1
CRediT authorship contribution statement

G. Gokulkumari: Conceptualization, Visualization, Methodology, Writing – original draft. Thamer Al-Hussain: Writing – review & editing. Syed Akmal: Writing – review & editing. Prakash Singh: Writing – review & editing.

Acknowledgment

The authors extend their appreciation to the Deputyship for Research & Innovation, Ministry of Education in Saudi Arabia for funding this research work through project number 7886.
Declaration of Competing Interest

The authors declare that they have no conflict of interest.

References

[1] Khan MA, Vivek V, Khojah M, Nabi MK, Paul M, Minhaj SM. Learners' perspective towards E-Exams during COVID-19 outbreak: evidence from higher educational institutions of India and Saudi Arabia. Int J Environ Res Public Health 2021;18(12):6534.
[2] Yousuf Shraim K. Online examination practices in higher education institutions: learners' perspectives. Turk Online J Distance Educ 2019;20(4).
[3] Fluck A, Adebayo OS, Abdulhamid SM. Secure E-Examination systems compared: case studies from two countries. J Inf Technol Educ Innov Pract 2018;16:107–25. http://www.informingscience.org/Publications/3705.
[4] Chirumamilla A, Sindre G. E-Exams in Norwegian higher education: vendors and managers views on requirements in a digital ecosystem perspective. Comput Educ 2021;172:104263.
[5] Ahmed FRA, Ahmed TE, Abu-Zinadah H. Analysis and challenges of robust E-Exams performance under COVID-19. Results Phys 2021;23:103987.
[6] Elsalem L, Al-Azzam N, Kheirallah KA. Stress and behavioral changes with remote E-Exams during the COVID-19 pandemic: a cross-sectional study among undergraduates of medical sciences. Ann Med Surg 2020;60:271–9.
[7] Sukadarmika G, Hartati RS, Linawati, Sastra NP. Introducing TAMEx model for availability of E-Exam in a wireless environment. In: Proceedings of the international conference on information and communications technology (ICOIACT); 2018. p. 163–7. https://doi.org/10.1109/ICOIACT.2018.8350741.
[8] Böhmer C, Feldmann N, Ibsen M. E-Exams in engineering education — online testing of engineering competencies: experiences and lessons learned. In: Proceedings of the IEEE global engineering education conference (EDUCON); 2018. p. 571–6. https://doi.org/10.1109/EDUCON.2018.8363281.
[9] Sindre G. Code writing vs code completion puzzles: analyzing questions in an E-Exam. In: Proceedings of the IEEE frontiers in education conference (FIE); 2020. p. 1–9. https://doi.org/10.1109/FIE44824.2020.9273919.
[10] Aksoy A, Ledet JW, Gunay M. Design considerations of a flexible computer-based assessment system. In: Proceedings of the 4th international conference on computer science and engineering (UBMK); 2019. p. 1–6. https://doi.org/10.1109/UBMK.2019.8907044.
[11] Aljader HKS. A comprehensive methodology to develop an efficient electronic learning management system that is compatible with various applications. In: Proceedings of the global conference for advancement in technology (GCAT); 2019. p. 1–6. https://doi.org/10.1109/GCAT47503.2019.8978334.
[12] Massing T, Schwinning N, Striewe M, Hanck C, Goedicke M. E-assessment using variable-content exercises in mathematical statistics. J Stat Educ 2018;26:174–89.
[13] Chirumamilla A, Sindre G, Nguyen-Duc A. Cheating in E-Exams and paper exams: the perceptions of engineering students and teachers in Norway. Assess Eval High Educ 2020;45(7):940–57.
[14] Nsor-Ambala R. Impact of exam type on exam scores, anxiety, and knowledge retention in a cost and management accounting course. Account Educ 2020;29:32–56.
[15] Keen J, Salvatorelli A. Principles and practice of engineering exam pass rate by gender. Eng Stud 2018;10:158–68.
[16] Jung IY, Yeom HY. Enhanced security for online exams using group cryptography. IEEE Trans Educ 2009;52(3):340–9. https://doi.org/10.1109/TE.2008.928909.
[17] Muzaffar AW, Tahir M, Anwar MW, Chaudry Q, Mir SR, Rasheed Y. A systematic review of online exams solutions in E-learning: techniques, tools, and global adoption. IEEE Access 2021;9:32689–712. https://doi.org/10.1109/ACCESS.2021.3060192.
[18] Parent DW. Improvements to an electrical engineering skill audit exam to improve student mastery of core EE concepts. IEEE Trans Educ 2011;54(2):184–7. https://doi.org/10.1109/TE.2010.2042451.
[19] Atoum Y, Chen L, Liu AX, Hsu SDH, Liu X. Automated online exam proctoring. IEEE Trans Multimed 2017;19(7):1609–24. https://doi.org/10.1109/TMM.2017.2656064.
[20] Pardines I, Sanchez-Elez M, Martínez DAC, Gómez JI. Online evaluation methodology of laboratory sessions in computer science degrees. IEEE Rev Iberoam Tecnol Aprendiz 2014;9(4):122–30. https://doi.org/10.1109/RITA.2014.2363003.
[21] Balaha HM, Saafan M. Automatic Exam Correction Framework (AECF) for the MCQs, essays, and equations matching. IEEE Access 2021;9:32368–89. https://doi.org/10.1109/ACCESS.2021.3060940.
[22] Figueroa-Cañas J, Sancho-Vinuesa T. Early prediction of dropout and final exam performance in an online statistics course. IEEE Rev Iberoam Tecnol Aprendiz 2020;15(2):86–94. https://doi.org/10.1109/RITA.2020.2987727.
[23] Shen J, Hiltz SR, Bieber M. Learning strategies in online collaborative examinations. IEEE Trans Prof Commun 2008;51(1):63–78. https://doi.org/10.1109/TPC.2007.2000053.
[24] Trussell HJ, Gumpertz ML. Comparison of the effectiveness of online homework with handwritten homework in electrical and computer engineering classes. IEEE Trans Educ 2020;63(3):209–15. https://doi.org/10.1109/TE.2020.2971198.
[25] Shen J, Hiltz SR, Bieber M. Collaborative online examinations: impacts on interaction, learning, and student satisfaction. IEEE Trans Syst Man Cybern Part A Syst Hum 2006;36(6):1045–53. https://doi.org/10.1109/TSMCA.2006.883180.
[26] Hernandez J, Rodríguez F, Hilliger I, Pérez-Sanagustín M. MOOCs as a remedial complement: students' adoption and learning outcomes. IEEE Trans Learn Technol 2019;12(1):133–41. https://doi.org/10.1109/TLT.2018.2830373.
[27] Gaspar Martins S. Weekly online quizzes to a mathematics course for engineering students. Teach Math Appl Int J IMA 2017;36(1):56–63. https://doi.org/10.1093/teamat/hrw011.
[28] Xu Z, Yuan H, Liu Q. Student performance prediction based on blended learning. IEEE Trans Educ 2021;64(1):66–73. https://doi.org/10.1109/TE.2020.3008751.
[29] Llamas-Nistal M, Mikic-Fonte FA, Caeiro-Rodríguez M, Liz-Domínguez M. Supporting intensive continuous assessment with BeA in a flipped classroom experience. IEEE Access 2019;7:150022–36. https://doi.org/10.1109/ACCESS.2019.2946908.
[30] Luo H, Luo J. Robust online orientation correction for radiographs in PACS environments. IEEE Trans Med Imaging 2006;25(10):1370–9. https://doi.org/10.1109/TMI.2006.880677.
[31] Kim Y, Jeong S, Ji Y, Lee S, Kwon KH, Jeon JW. Smartphone response system using Twitter to enable effective interaction and improve engagement in large classrooms. IEEE Trans Educ 2015;58(2):98–103. https://doi.org/10.1109/TE.2014.2329651.
[32] Tian Y, Huang WN, Zhang L, Wang T. Disclosing personal names in screen names predict better final achievement levels in massive open online courses. IEEE Access 2021;9:50926–38. https://doi.org/10.1109/ACCESS.2021.3069451.
[33] Khan MA, Vivek V, Khojah M, Nabi MK, Paul M, Minhaj SM. Learners' perspective towards E-Exams during COVID-19 outbreak: evidence from higher educational institutions of India and Saudi Arabia. Int J Environ Res Public Health 2021;18(12):6534. https://doi.org/10.3390/ijerph18126534.
[34] Binnahedh IA. E-assessment: wash-back effects and challenges (examining students' and teachers' attitudes towards E-tests). Theory Pract Lang Stud 2022;12(1):203–11.
[35] Or C, Chapman E. Development and acceptance of online assessment in higher education: recommendations for further research. J Appl Learn Teach 2022;5(1).
[36] Adanır GA. Student acceptance of online proctored exams. In: Design and measurement strategies for meaningful learning. IGI Global; 2022. p. 212–29.
[37] Binnahedh IA. E-assessment: wash-back effects and challenges (examining students' and teachers' attitudes towards E-tests). Theory Pract Lang Stud 2022;12(1):203–11.
[38] Ahmed FRA, Ahmed TE, Saeed RA, Alhumyani H, Abdel-Khalek S, Abu-Zinadah H. Analysis and challenges of robust E-Exams performance under COVID-19. Results Phys 2021;23:103987.
[39] Shraım K. Online examination practices in higher education institutions: learners' perspectives. Turk Online J Distance Educ 2019;20(4):185–96. https://doi.org/10.17718/tojde.640588.