
Advances in Engineering Software 173 (2022) 103195


Analysis of E-Exam practices in higher education institutions of KSA: Learners' perspectives

G. Gokulkumari *, Thamer Al-Hussain, Syed Akmal, Prakash Singh
Department of E-Commerce, College of Administrative and Financial Sciences, Saudi Electronic University, Saudi Arabia

A R T I C L E I N F O

Keywords:
E-Exam
Questionnaire preparation
Exam implementation
ANOVA analysis
SEM analysis

A B S T R A C T

Assessment is an important aspect of learning. It is done to find out what each student learned. The adoption of valid assessment techniques will help to improve the quality of the educational process. It will allow students to determine whether or not they achieved their learning goals. Due to this importance, students are assessed in a variety of ways. E-Exams are an example of such a technique. Students in university courses must be examined in a short period through mid-examinations and final exams. This necessitates the use of a quick evaluation procedure to yield rapid results. This research work aims to analyze student perceptions towards the E-Exam system in higher education in Saudi Arabian universities. The study of the E-Exam system is primarily focused on the following subjects: (a) E-Exam implementation; (b) E-Exam environment factors; (c) student experience based on E-Exam attempts; (d) E-Exam system challenges; (e) recommendations for E-Exam requirements. The suggested work is divided into three stages. The first step begins with the preparation of a questionnaire that explicitly addresses the impact of the aforementioned motivations on learners. The data presented in the questionnaire were intimately associated with KSA university students. The developed questionnaire was distributed to students as well as research scholars at KSA universities in the second phase. All of the questions in the questionnaire are mandatory. In the third step, the responses of concerned individuals from Saudi Arabian universities are analyzed. The evaluation has been made with ANOVA and SEM analysis.

* Corresponding author.
E-mail address: g.govindasamy@seu.edu.sa (G. Gokulkumari).

https://doi.org/10.1016/j.advengsoft.2022.103195
Received 21 April 2022; Received in revised form 8 July 2022; Accepted 23 July 2022
Available online 17 August 2022
0965-9978/© 2022 Elsevier Ltd. All rights reserved.

1. Introduction

In today's life, technology plays a significant role and is used in almost all areas, such as medicine, business, and banking [1,2]. It has also been widely employed in educational institutions, such as schools, universities, and community colleges. Many schools use e-curricula, smart boards, and computer-based learning models. There are many factors behind the rising usage of ICTs in academic institutions for improving educational quality. It will make information intake and learning easier [3–6]. It will help to improve the efficiency of the process of putting educational policy into practice. E-learning is a successful education technique, capable of transmitting knowledge effectively to students. It depends on the use of advanced communication methods, including computers, computer networks, multimedia, audio-visual aids, graphics, mobile handheld equipment, and search engines [1,7–10]. E-learning, therefore, indicates that information, with effective interactions among instructors and learners, may be transmitted using all sorts of technology to provide the most benefits in a short period and with minimal effort [11–14].

In general, evaluation is an important part of the educational process. As a result, educators are engaged in developing a promising and effective method of assessment that incorporates E-learning approaches in the design and distribution of exams to students all over the world [31,32]. The technique would have to be credible and transparent [9]; in fact, this will enable connections among teachers and students while providing a tool for ongoing assessment [15–18]. As a result of this connection, academic attainment may improve, including the "learning outcomes of the course, such as knowledge, cognition, interpersonal skills, responsibility, communication, and information technology skills" [19–22].

Multiple-choice, true or false, matching, arranging, fill in the blank, essays, and other types of questions make up an E-Exam. It is developed using customized software to detect an individual's performance in all important aspects. There is a distinction between an E-Exam and a CBA, which depends on the use of special software without an internet connection [10]. On the other hand, NBA is focused on the use of internet techniques, including a remote test network that considers the range of its coverage [23–26]. The TAMEx model was created to increase the availability of an E-Exam conducted over a wireless network,
Nomenclature

Abbreviations
NBA network-based assessment
DU Delhi University
CBA computer-based assessment
JMI Jamia Millia Islamia
TAMEx time-adaptive for mobile E-Exam
PSE perceptions of students toward E-Exams
SEU Saudi Electronic University
NCT national capital territory
JNU Jawaharlal Nehru University
POE pre-requisites of E-Exams

particularly a "WLAN IEEE 802.11" [30]. This approach automatically adds time to compensate for the time lost due to connection disruption during a specific exam [11]. E-Exams ease the process of students' assessments. They assist both the administrative personnel and students in reducing the amount of time spent preparing and monitoring exams. However, administrators still encounter problems when it comes to improving E-Exam authentication. Our review has two goals: first, we want to shed light on the experiences and challenges of implementing an E-Exam system in a variety of nations, both developed and developing. Second, we provide fresh research in an attempt to address the electronic exam's difficulties. This examination method was initially used at the University of King Abdul Aziz in the Middle East. Faculty training courses on modern trends in examinations, assessment, and E-Exam preparation, faculty guidance on how to measure and assess students scientifically, proper ways to formulate examinations, and studies of examination programs prepared by the department overseeing distance examinations were conducted by the university [27–29]. King Abdul Aziz University's experience with an E-Exam has resulted in major educational advantages, such as the eradication of counterfeiting and cheating. Many universities and colleges in Sudan, including the Academy of Medical Sciences and Technology, the Sudan University of Science and Technology, and Medicine Colleges, have begun to use E-Exams. Neelain University has shown to be effective in using E-Exams and has reaped the benefits. There, 100,000 students take the final test and use iPads to answer the questions; the exam is held in closed classrooms. The experiment was effective in reducing cheating, and the advantages of the E-Exam in terms of electronic correction and speed of results distribution are clear, especially when a big number of students are involved. This experiment has decreased fraud; particularly for a big number of students, the value of the E-Exam seems evident when it comes to electronic correction and speed of reporting. Now the Sudanese Ministry of Education seeks to publish a questionnaire to prepare the way for the future conduct of E-Exams for final exams. It was found that students accept that quick feedback in E-Exams helps them to know the subject well [35,36,39]. It was discovered that the E-Exam is an efficient method for evaluating the student's knowledge [38]. Many studies show that students choose E-Exams over printed exams because they can do their exams at any time they prefer [33,34,37]. They can get their marks immediately and it is an easy process.

Merits and challenges of E-Exams: With COVID-19 becoming worldwide in early 2020, many of the emerging innovations have constantly served the improvement of the educational process. Technology is primarily employed for evaluation reasons. Fortunately, in the twenty-first century, the E-Exam has profited as test-takers are accustomed to seeing and using new technology. Their frequent use of cell phones and other electronic devices makes it easier to complete such tests. The technology also saves cost for complicated aspects of the testing cycle, gives immediate scores, assures more precise marking than human marking, prevents handwriting and student identification bias, and reduces the impact of student mistakes. With this technological advancement, E-Exam difficulties have considerably diminished. Still, there are possible inconveniences to the system, such as interoperability issues, which can jeopardize the reliability of the students' answers; the necessity of appropriate facilities; safety testing (cheating prevention); backup for unexpected problems; and limitations in the technology of the student/user. Many studies show that E-Exams are well-liked by students and have a favorable impact on teacher evaluations, as well as increasing the number of students monitored by a faculty member.

The challenges of E-Exam design: The use of multimedia in E-Exams allows the deployment of new types of questions, as well as a suitable method to collect feedback, quick support, help during the test, the dissemination of results, the easy use of information, and the versatility of the examination schedule. Furthermore, preparing similar forms of assessment multiple times for a huge count of examinees in various locations and submitting them by email or through websites is simple [19,30]. Once all the questions are answered, an E-Exam provides quick results and can provide a direct study of test performance for a group of examinees. In terms of effort, time, and money, it is a low-cost technique [14,20,21]. A question bank can be created to serve as a selection source for future exams.

The major contributions of this research work are:

• In this research work, an empirical study is undertaken to gather information about the E-Exam practices for higher education in KSA.
• A questionnaire has been prepared with 55 questions under 5 categories: (a) E-Exam implementation; (b) factors impacting the E-Exam environment; (c) student experience based on E-Exam attempts; (d) E-Exam system challenges; (e) recommendations for E-Exam requirements. The prepared questionnaire was distributed among the students as well as research scholars of KSA universities, and the responses based on their perspectives have been collected.
• The collected responses were analyzed via SEM, ANOVA, and PLS-SEM.

The rest of this paper is organized as follows: Section 2 discusses the E-Exam practices conducted in different universities. Section 3 portrays the analysis of the learner's perspective towards the E-Exam. Section 4 emphasizes the analysis of collected responses via SEM. The results and discussion of the outcomes in terms of ANOVA and PLS-SEM are manifested in Section 5.

2. Literature review

2.1. Related works

In 2021, Khan et al. [1] conducted research on 207 enrolled students in four universities: DU, JMI, and JNU, located in "NCT of Delhi, India, and Saudi Electronic University". The study took a quantitative approach, where responses were collected using online surveys. The investigation used CFA to see whether the method of online tests has been chosen as the suitable form of evaluation, using AMOS (version 24) software, assuring the security and well-being of learners. Cronbach's alpha was employed in the study to determine the reliability of the two latent constructs, namely PSE and POE, and the findings demonstrated that all of the measured variables had a higher degree of internal consistency. In addition, the researchers employed the mean and standard deviation for establishing the online test system's prerequisites. Pedagogy, confirmability, emotional aspects, manageability, and confidentiality were the major considerations in their decision.

In 2019, Shraim [2] looked at how students at Palestine Technical University-Kadoorie perceive online test methods. An opinion poll of 342 undergraduates revealed their opinions mostly on the benefits of online exams concerning pedagogy, authenticity, trustworthiness, emotional aspects, practicality, and protection. The findings indicate


that online tests were regarded to offer substantial advantages over conventional paper-based exams, such as grading consistency and savings in terms of duration, energy, and expenses incurred mostly in the exam process. Participants, on the other hand, highlighted several difficulties with the effective deployment of online tests, including concerns of confidentiality, authenticity, and fairness. The data further suggested that E-Exams were better suited to summative assessment, which is used to measure learning, rather than formative evaluation. Online examinations must always be designed to be legitimate, dependable, trustworthy, and adaptable to be implemented successfully.

In 2018, Fluck et al. [3] used a case study analysis, comparing an E-Exam at "FUT Minna Nigeria with one at the University of Tasmania in Australia". Considering various approaches to question kinds, cohort size, technology employed, and security features, the responsibilities supported or impeded by each of the two E-Examination systems were contrasted. The researchers' goal was to help stakeholders (such as lecturers, invigilators, applicants, computer instructors, and server operators) figure out how to make the process better. It takes into account the relative convenience of students, managers, and lecturers and the dependability and safety of both systems. Challenges in E-Exams had been seen by contrasting the systems in each country. The authors suggest how more successful E-Examination methods might be developed.

In 2021, Chirumamilla and Sindre [4] examined the essential characteristics and processes of these different stakeholders in E-Exam platforms. An exploratory case study was carried out drawing on discussions with 12 members of three distinct groups: salespeople at Norway's universities, process managers, and systems managers. These groups agree greatly on the fundamental characteristics of the E-Exam systems, but emphasize that not all capabilities sought by end-users were given priority. There was also much agreement on allowing smoother integration with the information systems in the higher education sector, and easier add-ons for specific purposes, with standardization, open interfaces, and digital ecosystems - but the ambitions of a flexible ecosystem still have to be reached.

In 2021, Ahmed et al. [5] intended to present the E-Examination and E-evaluation experiences of education organizations in different countries. The report suggests that in the worldwide COVID-19 pandemic pupils were evaluated using a comprehensive, ongoing assessment, including an authentication-supported E-Exam. This assisted in identifying and avoiding student infringements. The data reveal that, among many other LMS systems such as Blackboard and e-Front, Moodle and proprietary solutions were mostly used. Due to the zero cost of such solutions, the least developed nations chose to employ open-source and proprietary solutions.

In 2020, Elsalem et al. [6] conducted cross-sectional research in which students in Jordan assessed their experience with remote E-Examination during the COVID-19 pandemic. The students of the Faculties of Medical Science ("Medicine, Dentistry, Pharmacy, Nursing, and Applied Medical Sciences") of "Jordan University of Science and Technology" were surveyed through Google Forms with around 29 questions. The questions cover demographics for students, stress experiences and stress causes, and behavioral changes in connection with remote E-Exams. Descriptive statistics, cross-tables, and the Chi-square test were used for the analysis of the responses.

3. Analysis of learner's perspective towards E-Exam

3.1. Proposed hypothesis

To find solutions to the research problems and to obtain their core point, it is important to make statistical hypotheses that can be proven and answer the questions of the students about the E-Exam. Here four hypotheses are performed to evaluate the different responses.

The hypotheses formulated for the analysis of the learner's perspective towards the E-Exam are manifested below:

H0: There is no significant difference between the perceptions of students' interest in the E-Exam.
HA: The factors like security, privacy, and confidentiality have a significant effect on the E-Exam.
H0: There is no significance between the number of attempts and experience with student satisfaction.
HA: The challenges of the E-Exam have a significant effect on the E-Exam system.

Compared to paper-based examinations, E-Exams are effective and reduce the workload of the faculty members. The above four hypotheses are tested and each gives efficient results.

3.2. Methodology

The need for this study was to evaluate the students' view of E-Exams. E-Exams are a good way of assessing the student's potential in knowledge rather than traditional exams. An E-Exam is an electronic-based exam in which the computer system is used to evaluate the capacity and potential of students. This empirical study attempts to analyze the student perception of the E-Exam system for higher education in KSA universities and its impact on service quality. The proposed methodology was carried out in three phases.

• Initially, a questionnaire was prepared with 55 questions under 5 categories: E-Exam implementation; factors impacting the E-Exam environment; student's experience based on the attempt of E-Exam; E-Exam system challenges; suggestions for the requirements for the E-Exam.
• The prepared questionnaire was distributed among the students as well as research scholars of KSA universities. Since all the questions were prepared with due care, the students and research scholars were asked to fill in all the questions with a higher level of accuracy.
• In the third phase, the responses collected from the students and research scholars of KSA universities were taken for analysis.

3.3. Organized questionnaire

The questionnaire aims to collect exact information related to the research. The main objective of the questionnaire is to find the acceptance of students of the E-Exam over paper-based exams. In this paper, different questionnaires are prepared for determining the student's perception of the E-Exam system. The questionnaire preparation is the crucial as well as the fundamental part of this empirical study. To make this study more valuable and meaningful, a questionnaire has been prepared with 55 questions, and these questions have been framed under 5 different sections (drivers): E-Exam implementation; factors impacting the E-Exam environment; student's experience based on the attempt of E-Exam; E-Exam system challenges; suggestions for the requirements for the E-Exam. Further, 10 questions have been framed under E-Exam implementation; 12 questions under factors impacting the E-Exam environment; 13 questions under student's experience based on the attempt of E-Exam; 13 questions under E-Exam system challenges; and 7 questions under suggestions for the requirements for the E-Exam. This designed questionnaire has been circulated among the students and research scholars who are pursuing higher education in KSA.

3.4. Data collection

In terms of data collection, 101 samples were collected from diverse respondents (students and research scholars), who are pursuing their higher education in KSA universities (both government as well as private


universities). The different types of questionnaires were shared with various university students through Google Forms via various social media. The responses have been collected using the Google Form. Respondents in the categories of female, male, and other, belonging to the age groups 20–25, 25–30, 30–35, 40–45, and above 45, and residing in semi-urban, rural, or urban places were used for data collection. Students of different age groups and different universities have given their responses to the E-Exam via Google Forms. These questionnaires, developed on a four-point basis, vary from strongly agree to strongly disagree. In addition, these respondents have been using different devices for E-Examination (laptop, desktop, tablet, smartphone, or both smartphone and laptop), and have used different means of communication (posting on the university website, text message, e-mail, WhatsApp, or any two modes). All these responders were requested to put forward their perspectives accurately, and they were asked to fill out all the questions. Accurately providing their perceptions gives a more reliable and efficient conclusion. This evident result gives a clear-cut conclusion to this research. The above-mentioned hypotheses were simulated in MATLAB, and the experimental investigation was carried out via ANOVA and SEM analysis, which is a way to find out whether survey or experiment results are significant.

4. Analysis of collected responses: SEM analysis

4.1. Analysis of responses collected under E-Exam implementation

Under this section, we've framed 10 questions and have collected the responders' perspectives on these questions. The framed questions
are manifested in Table 1, and the collected responses are given in the
form of a pie chart in Fig. 1. Among the 101 responders, 61% of par­
ticipants said that the E-Exam is applicable for both UG as well as PG
students. Then, the participants' perspectives toward the question "Do you think online assessments may help you study more effectively?" seem to vary, as 60%, 23%, 15%, and 2% of them have responded agree, strongly agree, neutral, and strongly disagree, respectively. As per the participants' perspective, the online exam is
appropriate for all the departments, as 70% of them have selected this
option. Then, based on the assessment modes that the participants have
undergone for their online exams, they have selected their preferred
mode as online within the university campus by 20%, By online at your
location by 70% (majority), offline paper-based inside the college
campus (by 6%) and by online video mode oral answering (by 4%).
Then, for the question, do you believe it is a good idea to give students a
chance to practice with a model assessment before they take the E-
Exam?; 56%, 29%, 15% of them responded as agree, strongly agree, and
neutral, respectively. Among the different question formats that have been utilized for answering the responses, the MCQ type has been preferred by most of the students (71%), while only 2% have selected the Audio/

video/visual question pattern. In addition, essay questions are given to test the knowledge of the students in depth. For responding to the essay questions, 76% of the students preferred to type their answers, which might seem to be the easiest as well as the simplest form for answering the essay questions legibly. For assessing the students' knowledge, and to understand their level of understanding, different forms of examinations have been followed. In this questionnaire, we've put forward a question to the participants to select their favorite pattern for examination. The participants' responses portray that 29% of them prefer open book exams, 29% of them prefer open web (online), 27% of them prefer pre-prepared before the exam, and 22% of them prefer any mode. Since it is essential to test the knowledge of the students, the online exams can be conducted at different periods. The participants of our survey have suggested that all the above (assignment, midterm only, final exam only) could be applicable, as 67% have selected this option. In addition, for the question "Immediate feedback in online exams helps learners to get a deeper understanding of the subject", 59%, 21%, 17%, and 3% of them have responded agree, strongly agree, neutral, and disagree, respectively.

Table 1
Framed questions under responses collected toward E-Exam implementation.

Question No.  Framed Questions
1. Which higher education level requires an E-Exam?
2. Do you think online assessments may help you study more effectively?
3. The use of an online test is appropriate for
4. Which mode do you prefer for attending online Exams?
5. Do you believe it is a good idea to give students a chance to practice with a model assessment before they take the E-Exam?
6. Choose the question format you want for your online test.
7. Choose the most appropriate approach for answering an essay question.
8. Select the type of exam that you prefer for answering questions.
9. Which of the below would you prefer for your online exams?
10. Immediate feedback in online exams helps learners to get a deeper understanding of the subject.

Fig. 1. Analysis of responses collected toward E-Exam implementation.
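The pie-chart percentages above are simple frequency shares. A minimal sketch of how such a breakdown is computed from raw answers (the tallies below mirror the 60/23/15/2 split reported for question 2, but are invented strings, not the actual responses):

```python
from collections import Counter

# Hypothetical raw answers to one four-point question.
responses = (["agree"] * 60 + ["strongly agree"] * 23
             + ["neutral"] * 15 + ["strongly disagree"] * 2)

counts = Counter(responses)
total = len(responses)  # 100 hypothetical respondents

# Percentage share of each option, as plotted in the pie charts.
shares = {option: round(100 * n / total) for option, n in counts.items()}
print(shares)
# → {'agree': 60, 'strongly agree': 23, 'neutral': 15, 'strongly disagree': 2}
```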

4.2. Analysis of responses collected under factors impacting E-Exam environment

Questions have been prepared under this section, and all these
questions are furnished in Table 2. The responses of the 101 participants
to these questions are given in the form of a pie chart in Fig. 2. For the 1st
question (What is the most important criterion for a ranking-based E-
Exam, in your opinion?), 48% of them selected the user-friendly option,
10% device compatibility, 19% of them selected the question distribu­
tion as per the time option, 7% of them have selected the relevant
answering tool existence (Mathematical expression) option and 17% of
them have selected the online technical support option. Then, 72% of
them said that beyond the regular method of learning, E-Exam inspires a
new way of learning. Moreover, for the 3rd question (Do you think that using an E-Exam service may make the learning process easier?), 81%, 8%, and 11% of them have selected the yes, no, and maybe options, respectively. Moreover, for the 5th question (Any platform must be compatible with an E-Exam?), 69%, 6%, and 28% of them have selected the yes, no, and maybe options, respectively. While performing the E-
Exam, the system issue was identified to be the major obstacle faced by
the students. Moreover, a majority of 30% of the participants have
neither agreed nor disagreed that online examinations are impractical
due to technical issues. In addition for the 8th question (It is extremely
uncommon for technological difficulties to arise during the E-Exam.),
26%, 8%, 38%, 23%, and 6% of them have selected the agree, strongly
agree, neutral, disagree and strongly disagree option, respectively. 60%,
8%, and 32% of the responders have said that they can, cannot, and are unsure whether they can raise a technical error issue ticket easily online, respectively. 21%, 4%, 65%, 7%, and 3% of the responders have selected the agree, strongly agree, neutral, disagree, and strongly disagree options, respectively, on whether the technical problems were resolved right away. For the question "Is the resolved solution obtained for the E-Exam technical error raised satisfactory?", a majority of 49% have neither agreed nor disagreed. Moreover, 61%, 10%, 24%, and 5% of them have selected the agree, strongly agree, neutral, and disagree options for the question "Does lack of familiarity with technology tools inhibit using the E-Exam?"
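Splits like the 26/8/38/23/6 pattern above can also be compared across respondent groups with a chi-square test of independence, the method Elsalem et al. [6] applied (Section 2.1). A minimal sketch with invented counts (not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical answer counts (agree / neutral / disagree) for two
# illustrative respondent groups, e.g. urban vs. rural; invented data.
observed = np.array([
    [30, 15, 10],   # group 1
    [12, 18, 16],   # group 2
])

# Chi-square test of independence: is the answer distribution
# independent of the group?
chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```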

Table 2
Framed questions under factors impacting E-Exam environment.

Question No.  Framed Questions
1. What is the most important criterion for a ranking-based E-Exam, in your opinion?
2. Beyond the regular method of learning, does E-Exam inspire a new way of learning?
3. Do you think that using an E-Exam service may make the learning process easier?
4. Which E-Exam browsers do you presently use?
5. Any platform must be compatible with an E-Exam?
6. The biggest obstacles faced during the exam are:
7. Do you believe that online examinations are impractical due to technical issues?
8. It is extremely uncommon for technological difficulties to arise during the E-Exam.
9. Can you raise the technical error issue ticket easily online?
10. Are the technical problems resolved right away?
11. Is the resolved solution obtained for the E-Exam technical error raised satisfactory?
12. Does lack of familiarity with technology tools inhibit using E-Exam?

Fig. 2. Analysis of responses collected towards factors impacting E-Exam environment.

5
G. Gokulkumari et al. Advances in Engineering Software 173 (2022) 103195

4.3. Analysis of responses collected under student's experience based on the attempt of E-Exam

The questions prepared under this category are furnished in Table 3. The responses collected for these questions are shown in Fig. 3. On analyzing the responses collected for the 1st question, 83% (the majority) of them have said that they feel more comfortable attempting an assessment on a computer screen than on traditional paper. Then, 54%, 10%, 26%, and 9% of them have responded agree, strongly agree, neutral, and disagree, respectively, that the exam schedule was clear and well-organized. For the 3rd
question (Were the instructions for the E-Exam clear and easy to understand?), 80% of them said "yes". Moreover, a majority of 89% of the participants have selected the true/false pattern as comfortable for answering the
questions. Moreover, for the question "What part of the E-Exam did you like the most?", 10% of them have selected customization and personalization, 6% fair grading, 19% automatic/fast grading, 47% the attempt in a preferable location, 9% easy and flexible answering, and 10% all of the above options. Then, for the
question "Is there a difference between men and women, when it comes to adopting and utilizing E-Exam?", most of the participants (63%) strongly disagreed that there is any difference between men and women in adopting and utilizing the E-Exam. In addition,
59%, 17%, 20%, and 4% of the participants have selected the agree, strongly agree, neutral, and disagree options, respectively, for the question "E-Exams are more efficient in terms of time, effort and money spent". In addition, 72% of the participants have said that keeping track
of their previous exam outcomes will help to comprehend their progress.
Then, 23%, 25%, 27%, 18%, and 8% of them selected the agree, strongly agree, neutral, disagree, and strongly disagree options for the question "Do you believe that online examinations pose major health and safety risks?". Moreover, 31%, 22%, 40%, 2%, and 6% of them selected the agree, strongly agree, neutral, disagree, and strongly disagree options for the question "Anxiety and tension are not increased by taking an E-Exam". Moreover, 47% of them have said that
E-Exam enables your faculty members to identify the student’s problems
and weaknesses, while 45% of the suggested the neither-nor option.
Moreover, for the question: “ In overall, how did you feel about the
experience of the E-Exam?”, 54% of them have selected excellent, 33%
of them have selected good, 10% of them have selected fair and 3% of
them have selected poorly.
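The percentage breakdowns reported above come down to a simple tally of categorical answers. As a minimal sketch (the responses below are invented for illustration, not the study's data), the distribution for one Likert item can be computed with the standard library:

```python
from collections import Counter

# Hypothetical responses to one five-point Likert question
responses = ["agree", "agree", "neutral", "strongly agree",
             "agree", "disagree", "neutral", "agree",
             "strongly disagree", "agree"]

counts = Counter(responses)
percentages = {option: 100 * counts[option] / len(responses)
               for option in ("agree", "strongly agree", "neutral",
                              "disagree", "strongly disagree")}
# percentages: agree 50%, strongly agree 10%, neutral 20%,
# disagree 10%, strongly disagree 10%
```

Every reported distribution in Sections 4.3 to 4.5 is of this form, one tally per framed question.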

Table 3
Framed questions under student's experience based on the attempt of E-Exam.

Question No. Framed Questions
1. Do you feel more comfortable attempting an assessment on a computer screen than on traditional paper?
2. Do you think the exam schedule was clear and well-organized?
3. Were the instructions for the E-Exam clear and easy to understand?
4. Which type of question pattern is more comfortable to answer?
5. What part of the E-Exam did you like the most?
6. Is there a difference between men and women, when it comes to adopting and utilizing E-Exam?
7. E-Exams are more efficient in terms of time, effort and money spent.
8. Is the left time reminder assisting you in finishing the exam on time?
9. Do you believe that keeping track of previous exam outcomes helps you comprehend your progress?
10. Do you believe that online examinations pose major health and safety risks?
11. Anxiety and tension are not increased by taking an E-Exam.
12. Do you think that E-Exam enables your faculty members to identify the student's problems and weaknesses?
13. Overall, how did you feel about the experience of the E-Exam?

Fig. 3. Analysis of responses collected towards student's experience based on the attempt of E-Exam.

4.4. Analysis of responses collected under E-Exam system challenges

The questions framed under the category of E-Exam system challenges are depicted in Table 4. The responses collected for these questions are shown in Fig. 4. 835 of the responders have suggested that

G. Gokulkumari et al. Advances in Engineering Software 173 (2022) 103195

Table 4
Framed questions under E-Exam system challenges.

Question No. Framed Questions
1. Marking online exams automatically is more accurate than paper-based marking.
2. Do you believe that online tests allow for more customizable learning than paper-based exams?
3. Which sort of question pattern do you think is more appropriate for your course evaluation? (If necessary, select more than one)
4. I prefer typing the answers rather than handwriting (paper based) for essay answers.
5. It is easier to cheat on online exams than with paper-based exams.
6. To avoid malpractice, online proctoring at the time of the exam is required.
7. Based on your attempt, the level of security in E-Exam is reliable?
8. Online exams facilitate more authentic assessment than traditional methods through the integration of multimedia, simulations, etc.
9. Do you think it is appropriate to apply facial authentication to academic tests?
10. Do you believe your privacy would be jeopardized if the learning tool analyzed images of you while you were doing your course assessments?
11. What do you think about the warning message alerts during the examination?
12. In AI proctoring, the violation warning system is relevant in terms of physical gestures.
13. Do you believe that the use of E-Exam is in line with current technical advancements?

marking online exams automatically is more accurate than paper-based marking. For the second question, "Do you believe that online tests allow for more customizable learning than paper-based exams?", 59%, 21%, 17%, and 3% of them selected agree, strongly agree, neutral, and disagree, respectively. Moreover, 45%, 16%, 30%, 8%, and 2% of them selected agree, strongly agree, neutral, disagree, and strongly disagree, respectively, for the statement "I prefer typing the answers rather than handwriting (paper based) for essay answers". Moreover, just 4% of them said that it is easier to cheat in online exams than in paper-based exams, while 36% said that cheating cannot be done in online exams. Moreover, for the statement "To avoid malpractice, online proctoring at the time of the exam is required", 44%, 3%, 39%, 2%, and 13% of them selected agree, strongly agree, neutral, disagree, and strongly disagree, respectively. Interestingly, a majority of them (76%) said that the level of security in the E-Exam is reliable. In addition, 42%, 4%, 44%, 3%, and 8% of them selected agree, strongly agree, neutral, disagree, and strongly disagree, respectively, for the statement "Online exams facilitate more authentic assessment than traditional methods through the integration of multimedia, simulations, etc.". Moreover, 66% of the participants said that facial authentication is required while undertaking an online examination. Surprisingly, 43% of them believed that their privacy may be jeopardized if the learning tool analyzed images of them while they were doing their course assessments. For the question "What do you think about the warning message alerts during the examination?", 18% of them said that the alerts provide alertness, 44% said that they cause distraction, 33% said that they are an inappropriate warning, and 65% said that they are mandatory for assessment. Then, 16%, 3%, 50%, 17%, and 15% of them selected agree, strongly agree, neutral, disagree, and strongly disagree, respectively, for the statement "In AI proctoring, the violation warning system is relevant in terms of physical gestures". Moreover, 64% of them strongly believed that the use of E-Exam is in line with current technical advancements.

Fig. 4. Analysis of responses collected toward E-Exam system challenges.

4.5. Analysis of responses collected under suggestions for the requirements for E-Exam

The questions developed under this category are manifested in Table 5, and the collected responses from 101 participants are illustrated in Fig. 5. Here, 81% of the participants said that an E-Exam will make the evaluation process much easier. In addition, 74% of them strongly believe that in an E-Exam the student's complete idea can be represented. Moreover, 32%, 2%, 40%, 7%, and 20% of them selected agree, strongly agree, neutral, disagree, and strongly disagree, respectively, for the statement "The volume of electronic exam questions is sufficient for the time allotted". Interestingly, the majority of the participants did not find it complex to concentrate on a question while undertaking an E-Exam. The participants suggest that the


Table 5
Framed questions under suggestions for the requirements for E-Exam.

Question No. Framed Questions
1. Do you think the E-Exam service will make the evaluation process easier?
2. Do you believe that in an E-Exam, the student's complete idea can be represented?
3. The volume of electronic exam questions is sufficient for the time allotted.
4. Do you find it hard to concentrate on the questions when doing an online exam?
5. Which of the following do you think might be made better?
6. What did you like most about E-Exam? Why?
7. What did you dislike about E-Exam the most? Why?

Fig. 5. Analysis of responses collected towards suggestions for the requirements for E-Exam.

user interface needs to be made much better.

5. Research and discussion

5.1. Simulation procedure

The collected responses from 101 participants were analyzed using one-way as well as two-way ANOVA in the MATLAB platform. The evaluation has been made with five drivers: E-Exam implementation; factors impacting the E-Exam environment; student's experience based on the attempt of E-Exam; E-Exam system challenges; and suggestions for the requirements for E-Exam. This evaluation has been made based on the developed hypotheses.

5.2. ANOVA analysis under factors impacting E-Exam environment

Tables 6 and 7 express the one-way and two-way ANOVA analyses under the factors impacting the E-Exam environment. Here, the between-columns sum of squares is 729.79 and the residual error is 2004.8. The one-way ANOVA was noteworthy, as F(1, 20) = 38.2215 and P = 8.6410e-126. The two-way ANOVA is also significant for the columns, with F(1, 20) = 36.7501 and P = 5.6488e-117; for the rows, F(1, 19) = 0.7819 and P = 0.7311; and for the interaction, F(1, 380) = 0.9224 and P = 0.8362.

Table 6
One-way ANOVA analysis under factors impacting E-Exam environment.
SS df MS F Probability
Columns 729.79 20 36.4894 38.2215 8.6410e-126
Error 2004.8 2100 0.9547
Total 2734.6 2120

5.3. ANOVA analysis under student's experience based on the attempt of E-Exam

Tables 8 and 9 express the one-way and two-way ANOVA analyses under the student's experience based on the attempt of E-Exam. Here, the between-columns sum of squares is 914.37 and the residual error is 1962. The one-way ANOVA was noteworthy, as F(1, 22) = 48.7226 and P = 1.0728e-172. The two-way ANOVA is also significant for the columns, with F(1, 22) = 47.3467 and P = 2.8762e-161; for the rows, F(1, 19) = 1.2258 and P = 0.2265; and for the interaction, F(1, 418) = 0.9369 and P = 0.7965.

5.4. ANOVA analysis under E-Exam system challenges

Tables 10 and 11 express the one-way and two-way ANOVA analyses under the E-Exam system challenges. Here, the between-columns sum of squares is 986.95 and the residual error is 1956.3. The one-way ANOVA was noteworthy, as F(1, 21) = 52.8520 and P = 1.6896e-177. The two-way ANOVA is also significant for the columns, with F(1, 21) = 55.1390 and P = 8.3829e-176; for the rows, F(1, 19) = 2.0941 and P = 0.0038; and for the interaction, F(1, 399) = 1.2230 and P = 0.0042.

5.5. ANOVA analysis under suggestions for the requirements of E-Exam

Tables 12 and 13 express the one-way and two-way ANOVA analyses under the suggestions for the requirements of E-Exam. Here, the between-columns sum of squares is 541.24 and the residual error is 1287.6. The one-way ANOVA was noteworthy, as F(1, 17) = 45.0355 and P = 8.8802e-104. The two-way ANOVA is also significant for the columns, with F(1, 17) = 44.6970 and P = 4.8721e-99; for the rows, F(1, 19) = 0.6281 and P = 0.8874; and for the interaction, F(1, 266) = 1.0621 and P = 0.2569.

5.6. Simulation procedure: PLS-SEM analysis

PLS-SEM is a multivariate analytical technique that aims to evaluate complex relationships among many variables. It is a second-generation multivariate analytical approach that is widely executed in marketing analysis. It is well suited to evaluating complex connections concurrently and is known for its predictive capability. PLS-SEM is also repeatedly noted for its ability to produce a solution with small-size samples.
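The one-way decomposition behind Tables 6, 8, 10, and 12 (sum of squares, degrees of freedom, mean squares, F) is standard and can be reproduced outside MATLAB. The sketch below uses plain NumPy on invented toy groups, not the study's actual responses:

```python
import numpy as np

def one_way_anova(groups):
    """Return (SS_between, SS_within, df_between, df_within, F)
    for a list of sample groups, as in a MATLAB-style ANOVA table."""
    all_data = np.concatenate(groups)
    grand_mean = all_data.mean()
    # Between-groups ("Columns") sum of squares
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    # Within-groups ("Error") sum of squares
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_data) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return ss_between, ss_within, df_between, df_within, f_stat

# Hypothetical Likert scores for three question groups
g1, g2, g3 = np.array([1, 2, 3]), np.array([2, 3, 4]), np.array([3, 4, 5])
ssb, ssw, dfb, dfw, f = one_way_anova([g1, g2, g3])
# For these toy groups: SS_between = 6.0, SS_within = 6.0, F = 3.0
```

The same F statistic, together with its p-value, is returned by `scipy.stats.f_oneway(g1, g2, g3)`.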


Table 7
Two-way ANOVA analysis under factors impacting E-Exam environment.
SS df MS F Probability
Columns 716.1 20 35.8051 36.7501 5.6488e-117
Rows 14.474 19 0.7618 0.7819 0.7311
Interaction 341.52 380 0.8987 0.9224 0.8362
Error 1636.8 1680 0.9743
Total 2708.9 2099

Table 8
One-way ANOVA analysis under student's experience based on the attempt of E-Exam.
SS df MS F Probability
Columns 914.37 22 41.5620 48.7226 1.0728e-172
Error 1962 2300 0.8530
Total 2876.3 2322

Table 9
Two-way ANOVA analysis under student's experience based on the attempt of E-Exam.
SS df MS F Probability
Columns 896.48 22 40.7490 47.3467 2.8762e-161
Rows 20.045 19 1.0550 1.2258 0.2265
Interaction 337.03 418 0.8063 0.9369 0.7965
Error 1583.6 1840 0.8607
Total 2837.2 2299

Table 10
One-way ANOVA analysis under E-Exam system challenges.
SS df MS F Probability
Columns 986.95 21 46.9978 52.8520 1.6896e-177
Error 1956.3 2200 0.8892
Total 2943.3 2221

Table 11
Two-way ANOVA analysis under E-Exam system challenges.
SS df MS F Probability
Columns 980.28 21 46.6802 55.1390 8.3829e-176
Rows 33.684 19 1.7728 2.0941 0.0038
Interaction 413.12 399 1.0354 1.2230 0.0042
Error 1490 1760 0.8466
Total 2917.1 2199

Table 12
One-way ANOVA analysis under suggestions for the requirements for E-Exam.
SS df MS F Probability
Columns 541.24 14 38.6598 45.0355 8.8802e-104
Error 1287.6 1500 0.8584
Total 1828.9 1514

Table 13
Two-way ANOVA analysis under suggestions for the requirements for E-Exam.
SS df MS F Probability
Columns 533.98 14 38.1414 44.6970 4.8721e-99
Rows 10.183 19 0.5360 0.6281 0.8874
Interaction 241.09 266 0.9063 1.0621 0.2569
Error 1024 1200 0.8533
Total 1809.3 1499

Because of the PLS algorithm's capability to create a result with smaller samples, researchers have affirmed that studies with smaller samples are acceptable when PLS-SEM is the analytic approach of choice. In general, PLS-SEM is a form of structural equation modeling, which aids in the estimation of complicated cause-effect correlation models with latent variables. The collected responses were analyzed using PLS-SEM.

Absolute indices analysis: Standardized Root Mean Square Residual (SRMR): Generally speaking, the SRMR is "an absolute fit measure and has been defined as a standardized difference between the observed correlation and the anticipated correlation." An SRMR of exactly zero indicates a perfect fit. In addition, the SRMR is not penalized for model complexity. The SRMR outcomes are shown in Table 14. The threshold value of SRMR is set at 0.08, and values below this threshold are considered to indicate a good fit. However, all the SRMR values recorded as the findings of this research are above the threshold, and thus indicate a poor fit.

Measurement model, Tucker and Lewis (TLI) and Bentler and Bonett (NFI) relative indices analysis: Table 15 shows the results acquired for the TLI and NFI outcomes. The model with a χ2 of zero is designated as the best model. TLI is measured according to the average data size. If the average correlation between variables is not strong, then the TLI is likewise low.

Measurement model: construct reliability: This is a criterion of quality; it must be sustained to demonstrate a greater correlation between indicators. The two main metrics for construct reliability are composite reliability and Cronbach's alpha. The internal reliability of a construct is established only if the values of Cronbach's Alpha, Dillon-Goldstein's Rho, and Dijkstra-Henseler's Rho are 0.7 or higher. In this study, all values collected are determined to be higher than the threshold and the construct reliability is higher. The Cronbach's Alpha, Dillon-Goldstein Rho, and Dijkstra-Henseler Rho values are shown in Table 16.

Measurement model: AVE: This is defined as "the average variance between a construct and its measurements." The AVE threshold is set at 0.5. However, the AVE values achieved are lower than the criterion. The average variance extracted is given in Table 17.

Structural model: path design matrix: The path design matrix of E-Exam implementation, factors impacting E-Exam environment, student's experience based on the attempt of E-Exam, E-Exam system challenges, and suggestions for the requirements for E-Exam is tabulated in Table 18.

Path graph: R-Squared: Table 19 exemplifies the R-Squared constraint corresponding to E-Exam implementation, factors impacting E-Exam environment, student's experience based on the attempt of E-Exam, E-Exam system challenges, and suggestions for the requirements for E-Exam.

Table 14
Analysis of Standardized Root Mean Square Residual (SRMR).
SRMR (threshold SRMR < 0.080)
Baseline Model 0.2162
Composite Model 0.1306
Factor Model 0.1143

Table 15
Analysis of TLI and NFI.
TLI NFI
Composite Model 0.2218 0.1657
Factor Model 0.4158 0.3104
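One common formulation of the SRMR reported in Table 14 is the root of the mean squared residual between the observed and model-implied correlation matrices, taken over the lower triangle. The sketch below is an illustrative NumPy implementation under that assumption (conventions differ on whether diagonal elements are included; here they are), using made-up 2 x 2 matrices rather than the study's correlations:

```python
import numpy as np

def srmr(r_observed, r_implied):
    """Standardized Root Mean Square Residual between an observed
    correlation matrix and a model-implied one (lower triangle,
    diagonal included)."""
    idx = np.tril_indices_from(r_observed)   # lower triangle incl. diagonal
    residuals = r_observed[idx] - r_implied[idx]
    return np.sqrt(np.mean(residuals ** 2))

r_obs = np.array([[1.0, 0.5],
                  [0.5, 1.0]])
r_fit = np.array([[1.0, 0.4],
                  [0.4, 1.0]])
value = srmr(r_obs, r_fit)   # one residual of 0.1 among three entries
```

An SRMR of 0 means the model reproduces the observed correlations exactly; the values in Table 14 (0.1143 to 0.2162) all exceed the 0.08 threshold, which is why the text reports a poor fit.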


Table 16
Analysis of construct reliability.
Construct Cronbach Alpha (> 0.7000) Dillon-Goldstein rho (> 0.7000) Dijkstra-Henseler rho (> 0.7000)
E-Exam implementation 0.3010 0.5052 0.5114
Factors impacting E-Exam environment 0.6283 0.7190 0.7029
Student's experience based on the Attempt of E-Exam 0.8094 0.8463 0.8441
E-Exam system challenges 0.6131 0.7039 0.7916
Suggestions for the requirements for E-Exam 0.3049 0.6038 0.7492
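The Cronbach's alpha values in Table 16 follow directly from the item-score matrix via the standard formula alpha = k/(k-1) * (1 - sum(item variances)/variance(scale total)). A minimal sketch on hypothetical Likert data (respondents x items, not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Two hypothetical 1-5 Likert items answered by four respondents
scores = np.array([[1, 1],
                   [2, 2],
                   [3, 4],
                   [4, 3]])
alpha = cronbach_alpha(scores)   # ~0.889 for these strongly correlated items
```

Alpha rises toward 1 as items covary strongly, which is why the weakly correlated constructs in Table 16 fall well below the 0.7 threshold.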

Table 17
Analysis of AVE.
Construct AVE (AVE > 0.5000)
E-Exam implementation 0.1909
Factors impacting E-Exam environment 0.2477
Student's experience based on the Attempt of E-Exam 0.3286
E-Exam system challenges 0.2681
Suggestions for the requirements for E-Exam

Path coefficients: Table 20 portrays the path coefficients of E-Exam implementation, factors impacting the E-Exam environment, student's experience based on the attempt of E-Exam, E-Exam system challenges, and suggestions for the requirements for E-Exam.

Indirect effects: Table 21 portrays the indirect effects of E-Exam implementation, factors impacting the E-Exam environment, student's experience based on the attempt of E-Exam, E-Exam system challenges, and suggestions for the requirements for E-Exam.

Total effects: Table 22 portrays the total effects of E-Exam implementation, factors impacting the E-Exam environment, student's experience based on the attempt of E-Exam, E-Exam system challenges, and suggestions for the requirements for E-Exam.

Inter-construct correlations: Table 23 portrays the inter-construct correlations of E-Exam implementation, factors impacting the E-Exam environment, student's experience based on the attempt of E-Exam, E-Exam system challenges, and suggestions for the requirements for E-Exam.

6. Conclusion

This research underwent an empirical investigation to examine students' attitudes regarding the E-Exam system in higher education in Saudi Arabian universities. The aim was to investigate student perceptions of the E-Exam system in Saudi Arabian higher education institutions, as well as its influence on service quality. The proposed approach was implemented in three stages. The developed questionnaire was given to students in KSA universities in the second phase. All of the questions in the questionnaire were made mandatory; as a result, the students were expected to fill in information as exactly as feasible. The comments from concerned students affiliated with KSA universities were analyzed in the third phase. The evaluation was carried out using SEM and ANOVA. The viability of the E-Exams paradigm in a supervised exam room scenario was successfully shown through this seed project. The experiments should be expanded across the higher education sector, according to our suggestion. The technique has proven to be generally applicable in the trials conducted so far in a variety of disciplines. However, the application of the E-Exam method throughout all disciplines has not yet been investigated, and therefore more research can be conducted across numerous higher education institutions in a larger variety of academic fields. This will help to improve and develop effective strategies and technology even further. Hardware and software, technical procedures, exam room procedures, policy, workflow design, user manuals, systems documentation, investigation of usage in many disciplines, and training for academics, students, IT, and examinations employees are all areas that require further attention. While the E-Exam system (user) is operational as a prototype for E-Exam trials, it still needs to be tweaked. Also, E-Exams contain considerable limitations. The possibility of technical errors is common, so interruptions can occur during exams. Understanding and accepting the latest technology is a bit difficult for students as well as faculty. Reducing and eliminating these limitations will provide a better output for the E-Exam system.

Ethical statement

This paper does not contain any studies with human participants or animals performed by any of the authors.

Funding statement

None.

Data availability statement

None.

Table 19
Analysis of path graph.
Construct R2 R2adj
E-Exam implementation 0.0000 0.0000
Factors impacting E-Exam environment 0.3837 0.3775
Student's experience based on the Attempt of E-Exam 0.2525 0.2449
E-Exam system challenges 0.2891 0.2819
Suggestions for the requirements for E-Exam 0.1202 0.1113

Table 18
Analysis of Path Design Matrix.
Construct E-Exam Factors impacting E-Exam Student’s experience based on E-Exam system Suggestions for the
implementation environment the Attempt of E-Exam challenges requirements for E-Exam

E-Exam implementation 0 1 1 1 1
Factors impacting E-Exam 0 0 0 0 0
environment
Student’s experience based on the 0 0 0 0 0
Attempt of E-Exam
E-Exam system challenges 0 0 0 0 0
Suggestions for the requirements 0 0 0 0 0
for E-Exam


Table 20
Analysis of path coefficients.
Construct E-Exam Factors impacting E-Exam Student’s experience based on E-Exam system Suggestions for the
implementation environment the Attempt of E-Exam challenges requirements for E-Exam

E-Exam implementation 0 0.6194 0.5025 0.5377 0.3468


Factors impacting E-Exam 0 0 0 0 0
environment
Student’s experience based on the 0 0 0 0 0
Attempt of E-Exam
E-Exam system challenges 0 0 0 0 0
Suggestions for the requirements 0 0 0 0 0
for E-Exam

Table 21
Analysis of indirect effects.
Construct E-Exam Factors impacting E-Exam Student’s experience based on E-Exam system Suggestions for the
implementation environment the Attempt of E-Exam challenges requirements for E-Exam

E-Exam implementation 0 0 0 0 0
Factors impacting E-Exam 0 0 0 0 0
environment
Student’s experience based on the 0 0 0 0 0
Attempt of E-Exam
E-Exam system challenges 0 0 0 0 0
Suggestions for the requirements 0 0 0 0 0
for E-Exam

Table 22
Analysis of total effects.
Construct E-Exam Factors impacting E-Exam Student’s experience based on E-Exam system Suggestions for the
implementation environment the Attempt of E-Exam challenges requirements for E-Exam

E-Exam implementation 0 0.6194 0.5025 0.5377 0.3468


Factors impacting E-Exam 0 0 0 0 0
environment
Student’s experience based on the 0 0 0 0 0
Attempt of E-Exam
E-Exam system challenges 0 0 0 0 0
Suggestions for the requirements 0 0 0 0 0
for E-Exam

Table 23
Analysis of inter-construct correlation.
Construct E-Exam Factors impacting E-Exam Student’s experience based on E-Exam system Suggestions for the
implementation environment the Attempt of E-Exam challenges requirements for E-Exam

E-Exam implementation 1 0 0 0 0
Factors impacting E-Exam 0.6194 1 0 0 0
environment
Student’s experience based on the 0.5025 0.6324 1 0 0
Attempt of E-Exam
E-Exam system challenges 0.5377 0.6426 0.7321 1 0
Suggestions for the requirements 0.3468 0.5608 0.6294 0.7228 1
for E-Exam
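In a recursive path model, indirect effects are products of direct paths and total effects are direct plus indirect. With the path matrix of Table 20, whose only nonzero row is E-Exam implementation, every two-step path vanishes, which is why Table 21 is all zeros and Table 22 equals Table 20. A small NumPy check using the coefficients from Table 20:

```python
import numpy as np

# Direct path coefficients from Table 20: B[i, j] = effect of construct i on j.
# Order: implementation, environment, experience, challenges, suggestions.
B = np.zeros((5, 5))
B[0, 1:] = [0.6194, 0.5025, 0.5377, 0.3468]

# Total effects = sum of all powers of B = (I - B)^{-1} - I
# (the series terminates here because B @ B is the zero matrix)
total = np.linalg.inv(np.eye(5) - B) - np.eye(5)
indirect = total - B
```

Here `indirect` is the zero matrix and `total` equals `B`, matching Tables 21 and 22.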

CRediT authorship contribution statement

G. Gokulkumari: Conceptualization, Visualization, Methodology, Writing – original draft. Thamer Al-Hussain: Writing – review & editing. Syed Akmal: Writing – review & editing. Prakash Singh: Writing – review & editing.

Declaration of Competing Interest

The authors declare that they have no conflict of interest.

Acknowledgment

The authors extend their appreciation to the Deputyship for Research & Innovation, Ministry of Education in Saudi Arabia for funding this research work through project number 7886.

References

[1] Khan MA, Vivek V, Khojah M, Nabi MK, Paul M, Minhaj SM. Learners' perspective towards E-Exams during COVID-19 outbreak: evidence from higher educational institutions of India and Saudi Arabia. Int J Environ Res Public Health 2021;18(12):6534.
[2] Yousuf Shraim K. Online examination practices in higher education institutions: learners' perspectives. Turk Online J Distance Educ 2019;20(4).


[3] Fluck A, Adebayo OS, Abdulhamid SM. Secure E-Examination systems compared: [22] Figueroa-Cañas J, Sancho-Vinuesa T. Early prediction of dropout and final exam
case studies from two countries. J Inf Technol Educ Innov Pract 2018;16:107–25. performance in an online statistics course. IEEE Rev Iberoam Tecnol Aprendiz
http://www.informingscience.org/Publications/3705. 2020;15(2):86–94. https://doi.org/10.1109/RITA.2020.2987727. May.
[4] Chirumamilla A, Sindre G. E-Exams in Norwegian higher education: vendors and [23] Shen J, Hiltz SR, Bieber M. Learning strategies in online collaborative
managers views on requirements in a digital ecosystem perspective. Comput Educ examinations. IEEE Trans Prof Commun 2008;51(1):63–78. https://doi.org/
2021;172:104263. 1. 10.1109/TPC.2007.2000053. March.
[5] Ahmed FRA, Ahmed TE, Abu-Zinadah H. Analysis and challenges of robust E- [24] Trussell HJ, Gumpertz ML. Comparison of the effectiveness of online homework
Exams performance under COVID-19. Results Phys 2021;23:103987. with handwritten homework in electrical and computer engineering classes. IEEE
[6] Elsalem L, Al-Azzam N, Kheirallah KA. Stress and behavioral changes with remote Trans Educ 2020;63(3):209–15. https://doi.org/10.1109/TE.2020.2971198. Aug.
E-Exams during the COVID-19 pandemic: a cross-sectional study among [25] Shen J, Hiltz SR, Bieber M. Collaborative online examinations: impacts on
undergraduates of medical sciences. Ann Med Surg 2020;60:271–9. interaction, learning, and student satisfaction. IEEE Trans Syst Man Cybern Part A
[7] Sukadarmika G, Hartati RS, Linawati, Sastra NP. Introducing TAMEx model for Syst Hum 2006;36(6):1045–53. https://doi.org/10.1109/TSMCA.2006.883180.
availability of E-Exam in a wireless environment. In: Proceedings of the Nov.
international conference on information and communications technology [26] Hernandez J, Rodríguez F, Hilliger I, Pérez-Sanagustín M. MOOCs as a remedial
(ICOIACT); 2018. p. 163–7. https://doi.org/10.1109/ICOIACT.2018.8350741. complement: students’ adoption and learning outcomes. IEEE Trans Learn Technol
[8] Böhmer C, Feldmann N, Ibsen M. E-Exams in engineering education — online 2019;12(1):133–41. https://doi.org/10.1109/TLT.2018.2830373. 1 Jan.-March.
testing of engineering competencies: experiences and lessons learned. In: [27] Gaspar Martins S. Weekly online quizzes to a mathematics course for engineering
Proceedings of the IEEE global engineering education conference (EDUCON); students. Teach Math Appl Int J IMA 2017;36(1):56–63. https://doi.org/10.1093/
2018. p. 571–6. https://doi.org/10.1109/EDUCON.2018.8363281. teamat/hrw011. March.
[9] Sindre G. Code writing vs code completion puzzles: analyzing questions in an E- [28] Xu Z, Yuan H, Liu Q. Student performance prediction based on blended learning.
Exam. In: Proceedings of the IEEE frontiers in education conference (FIE); 2020. IEEE Trans Educ 2021;64(1):66–73. https://doi.org/10.1109/TE.2020.3008751.
p. 1–9. https://doi.org/10.1109/FIE44824.2020.9273919. Feb.
[10] Aksoy A, Ledet JW, Gunay M. Design considerations of a flexible computer-based [29] Llamas-Nistal M, Mikic-Fonte FA, Caeiro-Rodríguez M, Liz-Domínguez M.
assessment system. In: Proceedings of the 4th International conference on Supporting intensive continuous assessment with BeA in a flipped classroom
computer science and engineering (UBMK); 2019. p. 1–6. https://doi.org/ experience. IEEE Access 2019;7:150022–36. https://doi.org/10.1109/
10.1109/UBMK.2019.8907044. ACCESS.2019.2946908.
[11] Aljader HKS. A comprehensive methodology to develop an efficient electronic [30] Luo H, Luo J. Robust online orientation correction for radiographs in PACS
learning management system that is compatible with various applications. In: environments. IEEE Trans Med Imaging 2006;25(10):1370–9. https://doi.org/
Proceedings of the global conference for advancement in technology (GCAT); 10.1109/TMI.2006.880677. Oct.
2019. p. 1–6. https://doi.org/10.1109/GCAT47503.2019.8978334. [31] Kim Y, Jeong S, Ji Y, Lee S, Kwon KH, Jeon JW. Smartphone response system using
[12] Massing T, Schwinning N, Striewe M, Hanck C, Goedicke M. E-assessment using Twitter to enable effective interaction and improve engagement in large
variable-content exercises in mathematical statistics. J Stat Educ 2018;26:174–89. classrooms. IEEE Trans Educ 2015;58(2):98–103. https://doi.org/10.1109/
[13] Chirumamilla A, Sindre G, Nguyen-Duc A. Cheating in E-Exams and paper exams: TE.2014.2329651. May.
the perceptions of engineering students and teachers in Norway. Assess Eval High [32] Tian Y, Huang WN, Zhang L, Wang T. Disclosing personal names in screen names
Educ 2020;45(7):940–57. predict better final achievement levels in massive open online courses. IEEE Access
[14] Nsor-Ambala R. Impact of exam type on exam scores, anxiety, and knowledge 2021;9:50926–38. https://doi.org/10.1109/ACCESS.2021.3069451.
retention in a cost and management accounting course. Account Educ 2020;29: [33] Khan MA, Vivek V, Khojah M, Nabi MK, Paul M, Minhaj SM. Learners’ perspective
32–56. towards E-Exams during COVID-19 outbreak: evidence from higher educational
[15] Keen J, Salvatorelli A. Principles and practice of engineering exam pass rate by institutions of India and Saudi Arabia. Int J Environ Res Public Health 2021;18
gender. Eng Stud 2018;10:158–68. (12):6534. https://doi.org/10.3390/ijerph18126534. Jun 17 PMID: 34204429
[16] Jung IY, Yeom HY. Enhanced security for online exams using group cryptography. PMCID: PMC8296437.
IEEE Trans Educ 2009;52(3):340–9. https://doi.org/10.1109/TE.2008.928909. [34] Binnahedh IA. E-assessment: Wash-Back Effects and challenges (examining
Aug. students’ and teachers’ attitudes towards E-tests). Theory Pract Lang Stud 2022;12
[17] Muzaffar AW, Tahir M, Anwar MW, Chaudry Q, Mir SR, Rasheed Y. A systematic (1):203–11.
review of online exams solutions in E-learning: techniques, tools, and global [35] Or C, Chapman E. Development and acceptance of online assessment in higher
adoption. IEEE Access 2021;9:32689–712. https://doi.org/10.1109/ education: recommendations for further research. J Appl Learn Teach 2022;5(1).
ACCESS.2021.3060192. [36] Adanır GA. Student acceptance of online proctored exams. Design and
[18] Parent DW. Improvements to an electrical engineering skill audit exam to improve measurement strategies for meaningful learning. IGI Global; 2022. p. 212–29.
student mastery of core EE concepts. IEEE Trans Educ 2011;54(2):184–7. https:// [37] Binnahedh IA. E-assessment: Wash-Back effects and challenges (examining
doi.org/10.1109/TE.2010.2042451. May. students’ and teachers’ attitudes towards E-tests). Theory Pract Lang Stud 2022;12
[19] Atoum Y, Chen L, Liu AX, Hsu SDH, Liu X. Automated online exam proctoring. (1):203–11.
IEEE Trans Multimed 2017;19(7):1609–24. https://doi.org/10.1109/ [38] Ahmed FRA, Ahmed TE, Saeed RA, Alhumyani H, Abdel-Khalek S, Abu-Zinadah H.
TMM.2017.2656064. July. Analysis and challenges of robust E-Exams performance under COVID-19. Results
[20] Pardines I, Sanchez-Elez M, Martínez DAC, Gómez JI. Online evaluation Phys 2021;23:103987.
methodology of laboratory sessions in computer science degrees. IEEE Rev Iberoam [39] Shraım K. Online examination practices in higher education institutions: learners’
Tecnol Aprendiz 2014;9(4):122–30. https://doi.org/10.1109/ perspectives. Turk Online J Distance Educ 2019;20(4):185–96. https://doi.org/
RITA.2014.2363003. Nov. 10.17718/tojde.640588.
[21] Balaha HM, Saafan M. Automatic Exam Correction Framework (AECF) for the
MCQs, essays, and equations matching. IEEE Access 2021;9:32368–89. https://doi.
org/10.1109/ACCESS.2021.3060940.
