
EMPLOYING THE PDSA CYCLE TO IMPROVE THE QUALITY OF

BLENDED LEARNING IN BASIC MEDICAL SCIENCE COURSES

____________

A Thesis

Presented

To the Faculty of

California State University, Dominguez Hills

____________

In Partial Fulfillment

of the Requirements for the Degree

Master of Science

In

Quality Assurance

____________

By

Areig Y. Al-Ramadin

Fall 2018
Copyright by

AREIG AL-RAMADIN

2018

All Rights Reserved


TABLE OF CONTENTS

PAGE

COPYRIGHT PAGE .......................................................................................................... ii

TABLE OF CONTENTS ................................................................................................... iii

LIST OF TABLES .............................................................................................................. v

LIST OF FIGURES ........................................................................................................... vi

ABSTRACT ...................................................................................................................... vii

CHAPTER

1. INTRODUCTION .......................................................................................................... 1

Background ............................................................................................................. 1
Higher Education Quality Models in Saudi Arabia ................................................ 4
Statement of the Problem ........................................................................................ 7
Purpose of the Study ............................................................................................... 9
Theoretical Bases and Organization ..................................................................... 10
Limitations of the Study........................................................................................ 14
Definition of Terms............................................................................................... 15

2. REVIEW OF THE LITERATURE .............................................................................. 16

Introduction ........................................................................................................... 16
Technology in Higher Education ........................................................................... 16
Blended Learning .................................................................................................. 21
Khan’s Octagonal Framework .............................................................................. 30
The Application of the Deming’s Cycle in Higher Education .............................. 32
Diffusion of Innovation Theory ............................................................................ 37
Application of the Literature ................................................................................. 38

3. METHODOLOGY ....................................................................................................... 39

Design of the Investigation ................................................................................... 39


Step 1: Plan ........................................................................................................... 41
Sample Selection ................................................................................................... 44
Student Recruitment Criteria ................................................................................ 48


Stage 2: Do ............................................................................................................ 49
Instrument and Data Collection ............................................................................ 49
Survey Validity and Reliability ............................................................................ 52
Recruitment Methods ............................................................................................ 53
Retrospective Cohort Analysis ............................................................................. 53
Focus Group .......................................................................................................... 55
Study Location ...................................................................................................... 58
Data Analyses ....................................................................................................... 59

4. RESULTS AND DISCUSSION ................................................................................... 60

Stage 3: Study/Check ............................................................................................ 60


Self-Administered Survey Analyses ..................................................................... 61
Demographic Data ................................................................................................ 61
Survey Item by Item Analysis............................................................................... 62
Domains Analyses ................................................................................................ 76
Regression Analysis of Survey Items ................................................................... 76
Analysis of the Open-ended Questions ................................................................. 83
Retrospective Analysis.......................................................................................... 83
Focus Group Data Analysis .................................................................................. 88
Course Site Characteristics ................................................................................... 89
Course Site Advantages ........................................................................................ 90
Course Site Disadvantages .................................................................................... 92

5. SUMMARY, CONCLUSION, AND RECOMMENDATIONS .................................. 94

Summary ............................................................................................................... 94
Conclusion ............................................................................................................ 97
Recommendations ................................................................................................. 99
Stage 4: Act ........................................................................................................... 99

REFERENCES ................................................................................................................... 0

APPENDICES ................................................................................................................ 101

A: BLENDED COURSE STUDENT SURVEY ................................................ 102


B: CONSENT TO ACT AS A RESEARCH SUBJECT .................................... 132
C: RIYADH ELM UNIVERSITY IRB APPROVAL LETTER ........................ 132

LIST OF TABLES

PAGE

1. The Thinking Part..........................................................................................................41

2. Course Site Evaluation Criteria.....................................................................................47

3. Themes and Codes of the Thematic Analysis................................................................58

4. Survey Results for Items 2 – 10.....................................................................................64

5. Questions 20, 21, and 22................................................................................................66

6. Survey Results for Items 11-19…..................................................................................68

7. Survey Results for Items 23 to 33…..............................................................................70

8. Multiple Choice Questions............................................................................................74

9. Regression Analysis of the Five Blended Learning Domains.......................................78

10. Regression analysis of the five survey items with the strongest effect........................79

11. Descriptive statistics of the satisfaction per respondent and per domain....................81

12. Descriptive Statistics of Students’ Perception of Domains per Gender......................82

13. Percentages of Students’ Outcomes (2008-2017)........................................................84

14. Correlations of Academic Year with Students’ Outcomes


and General Satisfaction for the Course BICH 221 (2008-2017).........................85

15. Descriptive and Analytical Statistics of Students’ Outcomes


and Satisfaction with and without Blended Learning..........................................86

LIST OF FIGURES

PAGE

1. Khan’s Octagonal Framework…..................................................................................31

2. PDSA Cycle…..............................................................................................................35

3. Model for Quality Improvement…...............................................................................40

4. Cause-and-Effect Diagram…........................................................................................43

5. The Percentages of Male and Female Respondents…..................................................62

6. Percentages of Students’ Responses per Domain….....................................................80

7. A Comparison between Male and Female per Domain…............................................83

ABSTRACT

The present period has witnessed the emergence of many contemporary learning

models because of the digital revolution. Due to the prevalence of learning models, a

comprehensive understanding of these educational practices and an assessment

framework are crucial in terms of their design and implementation. The motive behind

this research is to show that applying the concepts of continuous improvement to

learning assessment can strengthen the learning process and the commitment to

quality education. This study applied the Plan-Do-Study-Act cycle as a tool to assess the

effectiveness of a blended learning environment at Riyadh Elm University. The findings

indicate how the digital learning platform has improved students’ learning experiences.

Overall, the implementation of blended learning may enhance university performance

and the quality of education, taking into consideration that the successful application of

such a tool demands a radical change in the university’s culture and possibly curriculum.
CHAPTER 1

INTRODUCTION

Background

The last two decades have witnessed several social transformations in the areas of

information, communications, networks, computers, markets, and nationalism. In

response to these changes, technology has become an indispensable element of our life,

whether it is in the field of medicine, agriculture, psychology, transportation,

entertainment, or education. Technology has significantly penetrated the education

system, producing considerable and sweeping changes in the sector. Universities and

colleges are embracing the new technological possibilities in order to expand the

traditional teaching methodologies and promote active learning mechanisms. The

integration of technology in the educational process has become a major demand,

providing an innovative learning environment with highly customized learning settings.

Significant improvements in teaching and learning can be achieved using technology as a

means of introducing fundamental structural changes.

In today’s world, technology is not considered a privilege anymore. Instead, it

has become a prerequisite, a necessity for addressing new challenges in the higher

education sector. In spite of growing institutional autonomy, particularly in the private

higher education sector, there are some obvious boundaries to what Higher Educational

Institutions (HEIs) can do (Scott, Gallacher, & Parry, 2017). Governments and other

accreditation agencies are continuously imposing new regulations and requirements


under which the operations and services of academic institutions will be evaluated. The

integration of technology in education is one of the newest aspects to be addressed during

the evaluation and accreditation processes. This is because technology should satisfy the

demands of a variety of students looking for top-notch education. Recently, many

accrediting bodies have emerged to assess higher education institutions, improve learning

strategies, and maintain a good quality education.

Meanwhile, HEIs are gearing their efforts towards achieving a cost-effective

strategy in post-secondary education, providing distinguished learning opportunities and

equipping students to compete in a knowledge-based economy. Applying technological

tools in the educational process can expand learning in dynamic ways and contribute to

facilitating communication between the teacher and the learner. The interest in

Information and Communications Technology (ICT) by educational institutions, whether

in developed or developing countries, is evidence of the development and evolution of

individuals and societies. A report by the US Department of Education (2017) discussed

the role of technology in improving and enhancing students’ learning experiences, noting

that technology can provide students with the chance to access new learning

opportunities, regardless of the traditional obstacles of place and time.

In addition, technology can enable students to acquire learning outside the formal

HEIs’ boundaries and it permits students to gain access to high-quality learning resources

(e.g., laboratory supplies, e-books). Moreover, it enables the reinforcement of traditional

learning experiences through digital learning models such as blended learning.

Furthermore, it provides students with the capability to personalize their learning


experiences with respect to their individual needs, and it can guarantee the involvement

of students with special needs (US Department of Education, 2017). Shirky (2008) noted

that the assimilation of technology in education has a prominent role in enhancing the

student learning experience to become a multiversity educational environment. Ghadiri et

al. (2014) also mentioned that some students believe that technology can provide new

learning tools, instant feedback, and in-person instructions.

In the framework of higher education, it is well known that the classic, basic role

of higher educational institutions is educating youth. Strobl (2007) indicated that learning

is a social process composed of several strategies for delivering an efficacious learning

experience. For many years, educational institutions utilized classroom-based learning for

teaching and learning in which knowledge is transferred in a one-to-few or one-to-one

setting. Conventional education commonly takes place when the learner engages with the

teacher in a synchronous setting via face-to-face communication. In contrast, the digital

learning or the e-learning approach promotes personalization and occurs in an

asynchronous setting (Jen-Her, Robert, & Tzyh-Lih, 2010).

The provision of new educational systems such as online learning led to an

increase in flexibility in the school curriculum, communication, and opportunities for

access; still, there are some constraints with fully online courses in terms of the lack of

participation, social contact, and options (Holley & Oliver, 2010). The idea of blended

learning implies “the organic integration of thoughtfully selected and complementary

face-to-face and online approaches and technologies” (Garrison & Vaughan, 2008, p.

148). Most recent interpretations of blended courses imply that this pedagogical paradigm
provides possibilities for developing course content, problem-solving skills, social

interaction, collaborative learning, high-order thinking, and student engagement

(Graham, 2006; Norberg, Dziuban, & Moskal, 2011). Blended learning emerged as a

compromise solution by employing the strengths of both learning systems, online and

face-to-face learning (Garrison & Vaughan, 2008; Picciano, 2009).

The government of Saudi Arabia is continually seeking to take advantage of

technological advances in education. The 2000s witnessed the emergence of e-learning in

several HEIs in the Kingdom. The system of Saudi Higher Education began to adopt the

application of blended learning as a way to improve the quality and capacity of its

institutions (Alebaikan, 2010). Although many universities have begun to apply blended

learning, it is still implemented without a clear framework to ensure the quality of such a

system. Thus, this study is presented to investigate and assess the quality of a blended

learning experience in Saudi Arabia.

Higher Education Quality Models in Saudi Arabia

In 2004, the Higher Education Council in the Kingdom of Saudi Arabia

established the National Commission for Academic Accreditation and Assessment

(NCAAA), currently known as the Education Evaluation Commission-Sector of

Evaluation and Academic Accreditation (EEC-SEAA). The establishment of this

commission is part of the Kingdom’s aspirations for the future of the quality of higher

education (NCAAA, 2010). The commission summarizes its strategy in this vision

statement: “to encourage, support and evaluate the quality of post-secondary institutions
and the programs they offer” (p. 5), while it aims to accomplish the following: 1)

guarantee the quality of learning outcomes; 2) monitor the efficiency of administrative

and support services; 3) evaluate the quality and community contributions made by the

HEIs (NCAAA, 2010). The NCAAA has considered five primary contexts in the Quality

Assurance and Accreditation Handbook, which are the learning/teaching context, the

institutional context, the student support context, the community contribution context,

and the supporting infrastructure context (Gueorguiev, 2006).

The NCAAA designed and launched the National Qualifications Framework

(NQF) (NCAAA, 2009). The NQF is an essential reference for the establishment of

quality assurance standards and serves as a manual for guiding HEIs with regard to

quality assurance protocols and activities. HEIs in the Kingdom are continuously seeking

to improve their practices by adopting a global system of quality assurance and

accreditation that is widely recognized and equivalent to high global standards such as

the NQF. The framework is developed to maintain coherence concerning the standards of

learning outcomes in the Kingdom. The NQF aims to assist HEIs in conducting strategic

planning and self-review processes through providing them with suitable points of

comparison. Academic programs emerging within this framework must include specific

knowledge and skills required for professional practices within Saudi Arabia and give

consideration to cultural norms and educational policies in the region. The framework

organizes the types of student learning into four areas and characterizes learning

outcomes for each area, in addition to psychomotor skills (if applicable). The domains
are: “knowledge, cognitive skills, interpersonal skills, and communication/information

technology/numerical skills” (NCAAA, 2009, p. 4).

According to NCAAA, the NQF model for higher education should demonstrate a

“commitment to the analysis of knowledge, lifetime learning, and the utilization of

continuous quality improvement” (p. 3). By implementing measurable objectives, HEIs

must be encouraged to evaluate and make use of statistical and mathematical data, examine

issues using comprehensive ICT tools, and communicate conclusions and

recommendations.

Riyadh Elm University (REU), previously known as Riyadh Colleges of Dentistry

and Pharmacy (RCsDP), is located in Riyadh, Kingdom of Saudi Arabia. It offers

Bachelor’s degree programs in Dentistry, Dental Hygiene, Dental Assistant, Clinical

Pharmacy, Pharm D, Nursing, and Medical Laboratory Technology. In addition, it

provides several diplomas, postgraduate programs, internships, continuous education

programs, Saudi Board programs, and healthcare services. In the 2014/15 academic year

(AY), REU served approximately 3,000 students, 80% of whom were registered in dental

programs and 20% in others, while 92% were studying in undergraduate programs. The

proportion of female students in the undergraduate program was 61% and the male

students’ proportion was 39%. Teaching and the delivery of information is presently

being accomplished through different methods, including traditional teaching methods

(e.g., face-to-face). In addition, there is an online learning management platform known

as Moodle (Modular Object-Oriented Dynamic Learning Environment). This system

provides bidirectional communication between the instructor and learners. In recent


years, REU has taken significant steps towards the digitalization of its teaching and learning

processes. The university has created a blended learning environment based on Moodle as a

means of improving the delivery of courses and for students to take exams. Initially, there

was some resistance on the part of both faculty and students regarding implementation of

this system. However, it must be recognized that blended learning contributes to

overcoming some of the current difficulties in the education sector in Saudi Arabia

(Garrison & Vaughan, 2008). At this point, there are no conclusive thoughts about the

effectiveness of blended learning at REU, with a slight movement toward continuous

improvement of the current system.

REU was involved in a reevaluation process at both levels, institutional and

programmatic, as an integral part of the primary accrediting activities. The main

functions of higher education accreditation are to evaluate the quality of academic

programs, devise a culture of continuous improvement, and involve stakeholders in

institutional planning and evaluation. Since REU must comply with the EEC standards

and recommendations, it must develop action plans to improve and monitor areas of

weakness.

Statement of the Problem

Education delivery technologies are advancing at an exponential rate. These

technological developments will maintain a momentous influence on higher education at

global and national levels (Johnson et al., 2012). Yet, the process of linking the learning

outcomes to the technological learning methods used has not been clearly specified,

described, or fully grasped (Garrison, 2000). Keeping pace with such advancements has
become a significant challenge for academic institutions in view of the continuously

increasing costs of new technology and the highly competitive market. Nor should we

overlook the growing governmental pressures that face HEIs in terms of changing

regulations and diminishing sources of income, as well as the worldwide financial crisis.

Therefore, HEIs are constantly seeking to deliver high-quality education in the most cost-

effective way. The integration of technology in education emerges as a glimmer of hope

in facing the aforementioned challenges.

With the ever-increasing requirements of government regulations and

accreditation, it becomes necessary for universities and colleges to keep up with these

compulsory requirements to promote prosperity and development of their performance.

In line with this situation, REU is currently going through the process of renewal of

accreditation by the EEC in the Kingdom of Saudi Arabia. The EEC panel reported that

e-learning has been made almost compulsory for all courses and course directors.

However, the review panel deems it necessary that REU should establish an appropriate

mechanism to evaluate and monitor the current blended learning system, strictly monitor

the software system, and present more evidence. Currently, at REU there is no

standardized process for evaluating the performance of blended learning. Furthermore,

there is no performance benchmark for the implementation of this system. Therefore, a

framework must be established for real-time monitoring and improvement of the blended

learning environment at REU. Accordingly, this research was formulated to investigate

the efficacy of blended learning as the primary method of education in terms of junior

medical students’ perceptions and performance.


Purpose of the Study

HEIs and post-secondary instructors can utilize technological tools to provide a

cutting-edge learning environment for learners that support and improve current learning

systems. Additionally, teachers will use robust tools to design collaborative learning

experiences. However, technology presents both benefits and demands to teachers and

learners. Haughey and Anderson (1998) proposed that the benefits of technology in

education emanate from its potential to connect users with each other and with

information, while complications with interactivity, build, complexity, consumption of

time, and security may arise.

Due to the lack of evidence and performance benchmarks, REU needs to maintain

and boost the level of confidence of its stakeholders by establishing efficient

quality assurance activities to ensure courses within the blended learning environment are

well designed and regularly monitored. This research was proposed as an action plan in

light of the EEC recommendation with regard to the e-learning system and the need to

attain continuous quality improvement in the blended learning system at REU. In

addition, the study may provide REU stakeholders with significant data concerning the

effectiveness of the currently implemented learning management system and assist

instructors in developing a quality framework to advocate the blended learning approach.

Moreover, the research helps identify the capacity of blended learning to achieve the

intended course learning outcomes. This research can contribute to the

development of pedagogy and Learning Management System (LMS) in the private sector
in Saudi universities regarding the application of blended learning to improve the quality

of higher education in Saudi Arabia.

Theoretical Bases and Organization

Vast amounts of research have been performed to understand and review blended

learning in higher education. Educators realize the considerable challenges facing higher

education today and thus develop systematic approaches to deal with the advances in

technology and science. The shift from traditional learning methods to digitized learning

methods will create new student-learning opportunities. Meanwhile, the availability of

various learning approaches must be assessed and monitored as part of the quality

assessment process. In addition, the integration of technology in education should

constitute an essential component of HEIs’ strategies for exploring new teaching and

learning opportunities. This study addresses a new technological mode of teaching and

learning implemented at REU, namely blended learning. In addition, it utilizes the Plan-

Do-Study-Act (PDSA) cycle to provide a comprehensive approach in helping to apply a

change in the current practices and recommend quality improvements.

This research was designed primarily to address the hypothesis that blended

learning affects the learning process positively. Therefore, a continuous improvement

approach has been implemented to evaluate a blended learning environment at REU. As

well, this research aims to investigate students’ perceptions of blended learning at REU

and suggest a framework for continuous improvement. Several studies have been

completed since the blended learning approach was popularized in the education industry.

Most of these studies have demonstrated that the implementation of a well-designed


blended learning system can support the educational process and help in improving

student satisfaction. Particularly, this section deals with the primary hypothesis that

blended learning can affect the learning and teaching processes positively. Various

teaching instruction methods must be assessed and monitored as part of the learning

assessment process. As a continuous improvement approach, the PDSA cycle has been

implemented to evaluate a blended learning environment.

The PDSA continuous improvement model presents a holistic approach to

applying a change in current practices and recommending some quality improvements.

The PDSA cycle is among the most commonly utilized methods for quality assurance and

continuous improvement in higher education (Knight & Allen, 2012; Shokraiefard, 2011;

Sokovic et al., 2010). Many researchers pointed out that the PDSA cycle can be used for

enhancing the quality of courses and learning models in higher education (Brown &

Marshall, 2008; Aggarwal & Lynn, 2012). The scope of this study expands current

knowledge and contributes research on the continuous improvement of blended

learning. The PDSA cycle was implemented with the intent of improving the current

blended learning environment in parallel with the NQF of Saudi Arabia (NCAAA, 2009). This

study corresponds to contemporary literature and modern education technologies,

particularly reinforcing the effectiveness of blended learning through applying a

continuous improvement and assessment cycle.

Implementing the PDSA cycle in this situation may help REU to develop a

continuous loop of development in the blended learning environment and meet the

students’ needs over the lifetime of their learning experience, rather than the adoption of
an assessment process that is unsubstantiated and has no long-term goals. The PDSA

cycle (Deming, 1986) involves a four-step process:

• Plan: This stage starts by recognizing the main problem, collecting information, and suggesting action plans for a solution.

• Do: Action plans must be implemented and verified.

• Study: Outcomes must be continuously examined to analyze the impacts of the implemented action plans.

• Act: Implement the action plans in accordance with the previous outcomes and engage all stakeholders.

Each PDSA cycle represents a single step in the continual improvement tool

implementation. The duration of each cycle should be as short as possible for studying

the impacts of an action plan (e.g., per semester).
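To make the iterative structure concrete, the following minimal sketch records one improvement cycle per semester and carries the Act decisions of one cycle into the Plan step of the next. It is only an illustration: the thesis does not prescribe any software, and names such as PDSACycle and run_cycle, along with the stub measurement function, are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class PDSACycle:
    """One improvement iteration (e.g., one semester) in the PDSA loop."""
    semester: str
    problem: str                          # Plan: the main problem recognized
    action_plans: List[str]               # Plan: suggested actions
    outcomes: Dict[str, float] = field(default_factory=dict)   # Study: measured impacts
    decisions: List[str] = field(default_factory=list)         # Act: actions kept for the next cycle


def run_cycle(cycle: PDSACycle,
              measure: Callable[[List[str]], Dict[str, float]]) -> PDSACycle:
    # Do: the action plans are implemented in the course (implementation is course-specific).
    # Study: examine the measured impact of each implemented action.
    cycle.outcomes = measure(cycle.action_plans)
    # Act: keep the actions whose measured impact was positive; revisit the rest next cycle.
    cycle.decisions = [a for a in cycle.action_plans if cycle.outcomes.get(a, 0.0) > 0]
    return cycle


# Example: one semester-long cycle with a stub measurement function.
cycle = PDSACycle(
    semester="Fall 2017",
    problem="Low student satisfaction with course-site interaction",
    action_plans=["add discussion forum", "post weekly instructor feedback"],
)
run_cycle(cycle, measure=lambda plans: {p: 0.2 for p in plans})
print(cycle.decisions)  # actions carried into the Plan step of the next cycle
```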

The process of developing a dynamic learning environment relies largely on the

leadership of the institution and strategic alignment with learning objectives, alongside

the government performance in controlling and guiding the diffusion of technology in

higher education. Many researchers have investigated the most significant success

indicators for technology diffusion into the learning system of HEIs. These indicators

include: leadership, strategic planning and alignment, innovation in teaching, appropriate

organizational structure, technical infrastructure, e-learning feasibility, dedicated change

agent, and the quality assurance of e-learning (Divjak & Ređep, 2015; Begičević et al.,

2007).

Scholars and researchers recognize the various definitions of blended learning;

therefore, it is necessary at this stage to investigate the effectiveness of blended learning


from the students' point of view. This study focuses not on the institution’s or the

instructors’ perception of blended learning but on the students’ perception. Students’

satisfaction has been identified as one of the most important factors in evaluating the

quality of blended learning (Abou-Naaj et al., 2012; Small et al., 2012; Sher, 2009;

Wang, 2003). User satisfaction is one of the most common indicators among researchers

and academics to identify and evaluate the effectiveness of the education system (Wu &

Liu, 2013; Arbaugh, 2014). Previous research has considered that satisfaction with

learning symbolizes the aggregate of students’ opinions and feelings resulting from the

student-perceived effectiveness of blended learning (Wu, Tennyson, & Hsia, 2010).

According to Wu and Liu (2013), an approach to student learning satisfaction is developed based on three factors: easiness, learning climate, and perceived value. Overall, domains such as students’ satisfaction, students’ outcomes, interaction, technology, instruction, course management, and instructor were examined in this research from the perspective of the presented theories and literature, thus providing a solid framework for the implementation of effective learning through the development of a well-structured blended learning environment at REU. In addition, the following primary hypotheses were tested (one way such a hypothesis can be examined statistically is sketched after the list):

Hypothesis 1: The implementation of blended learning improves students’ satisfaction.

Hypothesis 2: The implementation of blended learning improves students’ outcomes.

Hypothesis 3: Satisfaction is impacted by different elements of blended learning.

Hypothesis 4: Satisfaction in blended learning varies across genders.
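As an illustration of how such a hypothesis can be examined statistically (the actual analyses are reported in Chapter 4), the sketch below tests a gender difference in mean satisfaction scores, as in Hypothesis 4, using Welch's independent-samples t-test. Python and SciPy are used here only for illustration; the scores are hypothetical placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-respondent satisfaction scores (e.g., the mean of each
# respondent's Likert items on a 1-5 scale); placeholders, not study data.
female_scores = np.array([4.1, 3.8, 4.5, 3.9, 4.2, 4.0, 3.7])
male_scores = np.array([3.9, 3.6, 4.0, 3.5, 4.1, 3.8])

# Welch's independent-samples t-test (does not assume equal variances):
# null hypothesis = mean satisfaction is the same for both genders.
t_stat, p_value = stats.ttest_ind(female_scores, male_scores, equal_var=False)

print(f"t = {t_stat:.3f}, p = {p_value:.3f}, reject H0 at 0.05: {p_value < 0.05}")
```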


This thesis is designed to provide quality professionals with background

information about the application of the PDSA cycle in education and a resource to

initiate a sustainable assessment of the blended learning approach in higher education.

Limitations of the Study

Although there is a wealth of research on the effectiveness of blended learning in

higher education, most studies are based on conducting a quantitative research

methodology. The lack of a probability sampling technique when applying a quantitative

research design is a clear limitation of such studies. Taking into consideration the human

factor, this study is limited to focusing on a sample of undergraduate students enrolled in

one course offered at REU. The research was conducted using one course as a sample for

research, which may not represent the majority of views at REU. The generalization of

the study findings is the most sensitive limitation of this study. Furthermore, the study

used a self-reported questionnaire that is limited by the degree of accuracy of the

participants’ responses. For example, students with good performance may give more

positive feedback than failing students.


Definition of Terms

Blended Learning: “the combination of instruction from two historically separate models

of teaching and learning: traditional face-to-face learning systems and distributed

learning systems” (Graham, 2006, p. 5).

Education Evaluation Commission (EEC): A government agency that is financially and

administratively independent. In addition, it is the regulatory body responsible for

conducting evaluation of education in public and private institutions in the Kingdom of

Saudi Arabia (EEC, 2016).

Online Learning: learning using electronic technologies (e.g., internet) to gain access to

educational curriculum from outside of academic institution boundaries. Usually, courses

are delivered via the internet, and online learning comprises “the separation of teacher and learner

during at least a majority of each instructional process” (Palloff, Pratt, & Stockley, 2001,

p. 5).
CHAPTER 2

REVIEW OF THE LITERATURE

Introduction

Many educators are interested in employing technological innovations in the

education process. This study addresses the employment of blended learning, which is

based on mixing and integrating the traditional methods of education with the use of

technological innovations. In an attempt to develop a comprehensive approach to answer

the problem of this study and support the hypothesis, the researcher developed a

systematic plan to research literature, considering the following aspects: technology in

higher education, the system of blended learning, and quality assurance in higher

education through applying the PDSA cycle.

Technology in Higher Education

This section highlights the substantial role of ICTs in education. Technological

innovations in the education sector focus on the employment of many innovative ideas

and inventions characterized by novelty and resulting from the rapid scientific and

technological developments in the world. Technological innovations change according to

time and vary according to the educational situation and environment. Technology has played

a pivotal role in the education process, placing students at the core and leading to

changes in many educational policies and practices. Shirky (2008) highlighted the impact

of technology in expanding the traditional learning environment, setting the student voice

as a prominent component in the education system.


Nowadays, technology has become essential in framing a student-focused

approach to post-secondary education. Just as high-speed technology has developed new

and continuously expanding types of jobs and competencies that require new abilities, it

has also made great advances in meeting the demands of a wider group of learners.

The result is that technology has created a revolution in the provision of education, thus

facilitating entry to higher education for more students at affordable cost and with greater

resilience. Nevertheless, for technological solutions to have a transformational effect on

student learning, they must be based on the goals and needs of the students. While

technology can be introduced into current systems to make them slightly more flexible

and effective, it also provides an opportunity to stimulate the reform of educational

practices and structures (U.S. Department of Education, 2017).

The integration of technological innovations in higher education should not be

viewed as a substitute for face-to-face education but as a tool for achieving desired

outcomes through improving the quality of learning/teaching, expanding access,

enhancing the productivity of the education system, and preparing competencies for the

labour market (Haddad & Jurich, 2002). In a research study, Kvavik and

Caruso (2005) reported that students actually prefer to have technology in their

courses to a moderate extent. Thus, evaluating the effectiveness of employing ICTs on a

given course is very important in making future improvements and decisions regarding

further investments in technology (Tyler, 2005). Cuban (1999) found that most

instructors and students at the post-secondary level are familiar with how to use web

pages and e-mail, yet less than 10% of instructors adapt ICTs for teaching. According to
Brill and Galloway (2007), “the insufficient availability of technology” and “the lack of

technology resources” (p. 29) are the two main restrictions for utilizing technology-

supported education. The employment of technological innovations in education must be

done in a deliberate, rather than random, manner in order to achieve the

required objectives.

Kim (2007) categorized learning styles into six distinct categories. The formal

learning methods include these forms: traditional learning (credited, scheduled courses

with physical classes), face-to-face course-scheduled learning (with no physical class),

distance learning (with scheduled, remotely offered courses and virtual classes), and

scheduled e-learning. The informal styles include non-credited traditional learning and

unscheduled self-e-learning.

Warschauer and Liaw (2010) noted that ICTs have not been fully utilized in

education and several researchers have reported mixed findings regarding the effect of

technology on education. Recently, many e-learning technologies have emerged

(Garrison, 2011) that can facilitate learning mobility (Herrington et al., 2012).

The employment of technology in higher education is constantly evolving.

Accordingly, it has resulted in improvements in traditional teaching and learning approaches,

the emergence of diverse digital learning methods and tools (e.g., distance learning,

blended learning, e-books, conferencing, collaborative authoring, and tools including

learning management systems), and the advancement of students’ skills (El-Mowafy,

Kuhn, & Snow, 2013; Wu et al., 2008).


In spite of the fact that distance learning has many advantages (e.g., expanding

access, eradicating geographical obstacles, enhancing fitness and efficacy for individual

and group learning), it also has some deficiencies, such as the lack of face-to-face

communication, the requirement for tutorial support, and high operating cost (Wu et al.,

2008; Yang & Liu, 2007; Kinshuk & Yang, 2003). Moreover, students in distance

learning may suffer from loss of interest (Maki et al., 2000), sense of isolation, confusion

and frustration (Hara & Kling, 2000).

Actually, several scholars questioned the effectiveness of e-learning and the

degree of students’ satisfaction with this approach (Piccoli et al., 2001; Santhanam et al.,

2008). Institutions, instructors, and educational technology developers are pursuing

alternative learning solutions to mitigate the impact of defects in the distance learning

system. Graham (2006) suggested the blended learning approach as a possible solution to

overcome online learning weaknesses.

Warschauer and Liaw (2010) summarized that “new technologies also can be

used as a medium for professional development, providing educators with both accessible

information and hands-on experiences with the same tools they may later use with their

students” (p. 20). So far, research findings in education have not conclusively endorsed ICTs

concerning their impact on the various contemporary teaching/learning modalities.

However, the employment of ICTs worldwide is accelerating. UNESCO encourages the

use of ICT stating that “ICT can contribute to universal access to education, equity in

education, the delivery of quality learning and teaching, teachers’ professional


development, and more efficient education management, governance and administration”

(UNESCO, 2002, p. 1).

The employment of technological innovations in the education process can be

accomplished in a systematic, rather than random, way in order to achieve the objectives of

such an application. Hilal and Qamar (2001) illustrated a set of factors that must

be addressed when employing technological innovations in the educational process:

• Setting goals: Each project has a set of objectives, which must be clear, specific, and realistic and linked to the needs and demands of the students themselves.

• Identification of needs and requirements: These needs should be determined by identifying the available human resources in terms of the availability of technicians, technology education professionals, and management commitment.

• Creating an appropriate learning environment: It is necessary to provide an appropriate psychological climate for teachers to employ technological innovations, and to provide an adequate educational environment that accepts these innovations and employs them appropriately.

• Implementation and follow-up: The follow-up shall be carried out to ensure that the project is implemented as agreed upon. The process of implementation and follow-up includes the evaluation process, which aims to measure the extent to which technological innovations fail or succeed in achieving the objectives.

Since 1994, the Kingdom of Saudi Arabia has accomplished discernible progress

in implementing ICT in higher education. This has become a hot topic around the discussion

table of nearly every HEI because of continuous technological advancement, society’s

response to these developments, and labour market needs. Therefore, the integration

process becomes an urgent need (Alfahad, 2012; Milligan, 2010; Caruso & Salaway,
2007). HEIs acknowledge the inevitability of investigating the current teaching practices

and the capability of ICTs. Garrison and Vaughan (2008) noted that understanding the

potential and strategies of blended learning to provide high-quality education is a

challenge.

This era has witnessed the emergence of several multimodal learning systems

because of technological developments. Multimodality is comprised of linguistic, spatial,

audio, and visual representations and face-to-face communication (Luke, 2003). Blended

learning has developed as a contemporary educational approach that couples two learning

models: face-to-face and e-learning.

Blended Learning

Many educators have directed their attention towards the employment of

technological innovations in the educational process, especially in the field of online

education. Since it is difficult to meet all of e-learning’s requirements or to fulfill its terms,

especially in some scientific fields, a compromise solution has been developed, namely

blended learning. Blended learning is based on mixing conventional teaching methods

with technological innovations (e.g., e-learning) in the education process (Voos, 2003).

This integration can maximize the benefits of using both learning methods (Graham,

2006; Harding et al., 2005). The shift from traditional methods to blended learning

does not happen overnight. However, we expect that it will promote qualitative

improvements in the management, engineering, and organization of post-secondary

education (Žuvic-Butorac et al., 2011).


The early 21st century features the emergence of the blended learning approach as

a contemporary learning style (Vo et al., 2017) and a widespread phenomenon in the field

of higher education (Masi & Winer, 2005). It has become increasingly obvious that blended

learning is capable of overcoming the various constraints and challenges related to both

systems, face-to-face and e-learning.

The merging of the traditional classroom with digital learning must be carefully designed

to achieve the greatest benefit from both methods (Güzer & Caner, 2014). Meanwhile, on-

line learning presents a modern apparatus that can augment conventional instruction. The

disadvantages of this method must be considered to improve the learning environment.

Blended learning couples the advantages of traditional classroom environment with those

of e-learning, yielding an arguably more coherent model.

This study includes a comprehensive review of the various aspects of blended

learning and its design. Several definitions of blended learning have been proposed.

Originally, Singh (2003) identified it as “a combination of multiple delivery media

designed to complement each other and promote learning and application-learned

behaviour” (p. 53). While Thorne (2003) interpreted blended learning as “a way of

meeting the challenges of tailoring learning and development to the needs of individuals

by integrating the innovative and technological advances offered by online learning with

the interaction and participation offered in the best of traditional learning” (p. 5). Later,

Masie described blended learning as “the mixture of e-learning and classroom learning”

(Masie, 2006, p. 22). The same year, Graham sought to explicate the blended learning

method as “a combination of face-to-face instruction and computer-mediated instruction”


(Graham, 2006, p. 5). Generally, it is an instructional learning system that merges several

learning modalities. Kim (2007) proposed this definition: “Blended learning is

learning outside the traditional classroom using information technology for the delivery

of the learning materials” (p. 2). Blended learning is established by “combining two kinds

of learning environments, one associated with online learning and the other conventional

teacher-led classroom” (Kudrik, Lahn, & Morch, 2009, p. 2). Köse (2010) indicated that

“blended learning is a learning approach that contains different types of education

techniques and technologies” (p. 2795). Staker and Horn (2012) reported that “blended

learning is a formal education program in which a student learns, at least in part, through

online delivery of content and instruction with some element of student control over time,

place, path, and/or pace and, at least in part, at a supervised brick-and-mortar location

away from home” (p. 3). In other terms, the design of a blended learning system depends

on the desired learning-outcome objectives. Norberg, Dziuban, and Moskal (2011)

indicated that in a blended learning environment, the time factor plays a key role,

irrespective of where students are located. The availability of multiple forms of learning

models led to the emergence of the term “the post modality era,” in which learners do not

acknowledge the course setup as a specific determinant in making their decision for

course enrollment (Cavanagh, 2012). Hinssen (2010) referred to the “new normal,” in

which educators and students regard learning technologies as the prevailing learning

process rather than an addition.


The literature indicates that many researchers have pointed out the advantages of

employing a blended learning system (Mebane et al., 2008; Harding et al., 2005; Bonk &

Graham, 2004) compared with traditional learning (Woltering et al., 2009).

Blended learning might be the right solution to improve upon, expand, and remedy

the imperfections of traditional learning (Alexander, 1999; Donnelly, 2010). Meanwhile,

blended learning redirects the attention from teaching towards learning in order to

develop an interactive learning atmosphere for both instructors and students. This

neoteric setting suggests a role shift for both educators and learners (Nunan, George, &

McCausland, 2000). This method grants many advantages, including the improvement of

social communication, cost reduction, ease of revision, facilitation of instructional richness

(Osguthorpe & Graham, 2003), reinforcement of the student’s autonomy (Chambers,

1999; Lebow, 1993; Radford, 1997; Tam, 2000), and improvement in the student’s

achievements (Boyle et al., 2003; Lim & Morris, 2009; O’Toole & Absalom, 2003). In

addition, blended learning has modified the ways in which teachers teach and learners

learn (Graham, 2006). Albalawi (2007) stated that online learning has a favorable effect

on improving the learning process. Alebaikan (2010) conducted a study in gender-segregated

environments in Saudi universities and found that blended learning may enhance or

expand the quality of learning. Moreover, Krasnova et al. (2015) considered that the

application of blended learning could yield qualitative improvements in the curriculum

and provide learners with diverse electronic components. The extensive literature

demonstrating the various benefits of blended learning inspired the researcher to conduct

this study for the purpose of improving the current system of blended learning at REU.
Educationists are unreservedly positive regarding the implementation of blended

learning because it creates new dimensions to learning, since the mix of space, media and

time can offer new opportunities with regard to the types of activities that students can

implement and the ways in which they can collaborate with the available electronic tools

(Littlejohn & Pegler, 2007). Toth, Morrow, and Ludvico (2009) indicated that blended

learning has a positive impact on the development of visual and mental skills (i.e. report

writing and data reading). As well, Kaur (2013) pointed out that a blended learning

system (e.g., Moodle) permits students to learn, use, and access information using

different modalities.

This research considers different aspects that may influence the development of a

successful blended learning system. An integration of leadership commitment and a well-

designed blended learning system, together with proactive support and guidance for

learners, can facilitate homework submission and, ultimately, student retention without

increasing teaching time (Hughes, 2007). Azizan (2010) suggested the following

advantages of blended learning:

• Improves collaboration, communication, and social interaction

• Extends efficiency and flexibility

• Increases mobility and reach

• Optimizes learning system development in terms of costs and time.

Graham and Robinson (2007) reported that students differ in the selection of the

most appropriate learning styles in terms of their circumstances or personalities.

However, several studies indicated that the education quality and outcomes are
influenced by the following factors: deficient communication (Laurillard, 2002), and loss

of motivation (Lim & Kim, 2003). Actually, Dziuban and Moskal (2011) demonstrated

that students evaluate their courses in the same manner regardless of the model of

instruction used (i.e., electronic versus traditional).

Bailey (2002) studied the impact of different learning strategies on interaction,

whether student-to-student or instructor-to-student interaction. He found a favorable

impact of blended learning on improving students’ satisfaction and awareness. Many

instructors reported high degrees of students’ satisfaction in blended learning courses. As

well, the quality and amount of interaction among students themselves and with their

instructors exceeded that in a face-to-face environment (Dziuban et al., 2011). In the same

context, the blended learning approach serves as a platform that addresses students’ needs

and their diverse learning styles by integrating interactive web-based methods with

conventional approaches (Garrison & Kanuka, 2004; Holley & Dobson, 2008). In a meta-

analysis of 1,100 research studies released between 1995 and 2009, blended learning

proved to be more efficient and dynamic than either face-to-face or online learning

(Means et al., 2009). Based on these findings, an item regarding students’ preference for the

mode of learning was incorporated into the survey used in this study. In addition, the students’

performances and outcomes were compared before and after the implementation of

blended learning.
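One hedged sketch of how such a before/after comparison of student outcomes could be run is shown below, using a chi-square test of independence on hypothetical pass/fail counts. The tooling (Python/SciPy) and the numbers are illustrative assumptions, not the retrospective data analyzed in Chapter 4.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical pass/fail counts for the course before and after the introduction
# of blended learning; placeholders only, not the study's retrospective data.
#                  pass  fail
before_blended = [310, 90]
after_blended = [360, 40]

table = np.array([before_blended, after_blended])

# Chi-square test of independence on the 2x2 table:
# is the pass/fail outcome associated with the delivery mode?
chi2, p_value, dof, expected = chi2_contingency(table)

print(f"pass rate before = {before_blended[0] / sum(before_blended):.2%}")
print(f"pass rate after  = {after_blended[0] / sum(after_blended):.2%}")
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}, significant at 0.05: {p_value < 0.05}")
```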

As new technology trends have continuously emerged and come to be considered natural

and important, educators have integrated them into their courses, mainly to improve quality and

effectiveness. In addition, new technology enables learners to use their


capabilities and resources more efficiently when they can use all components and features

of the learning management system anywhere, anytime (Norberg et al., 2011). However,

blended learning still has some shortcomings concerning students’ learning satisfaction

(So & Brush, 2008). Since blended learning combines the positive aspects of both

models, it becomes a comprehensive learning model that provides the experience of

physical classroom formats (i.e., books, lectures, labs, and handouts) mixed with

telecommunication technologies (i.e., World Wide Web, computer, and Internet). Most

instructors do not have sufficient knowledge of blended learning systems in terms of both

theoretical preparation and experimental experience. Thus, determining the most

convenient design for a blended learning course is a significant challenge for most instructors

(Huang & Zhou, 2005). Though a vast amount of research has suggested several designs

for blended learning systems (Boyle et al., 2003; Garrison & Vaughan, 2008; Huang &

Zhou, 2005; Kenney & Newcombe, 2011), few have attempted to investigate the pros and

cons of each design (Twigg, 2003; Graham, 2009, 2012a).

In the literature, researchers have differed in selecting the elements of a blended learning

system. Kerres and Witt (2003) developed a model that depicts educational design

decisions concerning the components of a blended learning program, namely the 3-C

Model. This model consists of the following components:

• Communication: the communication component provides interpersonal exchange between students and with their instructors.

• Content: the content component addresses the availability of learning material.

• Constructive: the constructive component guides and facilitates the learning activities.
Baba et al. (2014) assessed the effectiveness of a blended learning experience by

studying the following four areas: overall students’ satisfaction, guidance and training,

convenience afforded, and learning outcomes. In Rahman, Hussein, and Aluwi’s (2015)

study, the framework is centered on five elements: “students’ satisfaction with blended

learning, perceived ease of use, perceived value, learning climate, and interaction” (p.

770).

For the purpose of studying the efficacy of the digital learning platform at REU,

several domains were delineated as influencers on the student experience in the e-

learning environment: interaction, instructor, technology (Bollinger & Martindale, 2004;

Smart & Cappel, 2006; Belanger, 1999; Bower & Kamata, 2000), course management

(Moore, 2011), and instruction (Carmel & Gold, 2007). Another important component

that has been thoroughly investigated in this research is student satisfaction. Student

satisfaction is an essential performance indicator of the student's academic and

educational experience (Paechter, Maier & Macher, 2010). Many researchers and

academics studied student satisfaction to verify the effectiveness of blended learning

(Alruwaih, 2015; Melton et al., 2009; Wu et al., 2010). According to Moore (2011),

student satisfaction is considered as a major pillar of quality in education, besides

learning effectiveness, institutional cost-effectiveness, accessibility, and faculty

satisfaction.

When developing an e-learning environment, instructors encounter several

concerns that influence how students receive knowledge in a blended


learning environment. The decisions relating to the design of the learning platform

depend on five domains (Ehlers, 2004):

 Interaction domain: Blended learning instructors acknowledge the


significance of designing a learning environment that promotes dialogue,
interaction, and guidance in order to achieve learning outcomes at the same
level as traditional courses (Babb, Stewart, & Johnson, 2010). However,
Shedletsky and Aitken (2001) stressed that the quality and level of
communication between the student and the educator is a common interest in
the blended learning framework. Moreover, Brophy (1999) viewed the
interactions/communication domain as one of the major components that
affects students’ satisfaction (Azizan, 2010; Bailey, 2002; Dziuban et al.,
2011). Therefore, this study questioned students about their perceptions of
effective communication within a blended learning environment to understand
their point of view regarding their experiences. This domain includes student-
student interaction, student-instructor interaction, class collaboration, and
communication channels (Paechter, Maier & Macher, 2010; Abou-Naaj et al.,
2012).

 Technology domain: The technology domain exemplifies a modern way to


present and display course content (Abou-Naaj et al., 2012). Choosing the
right delivery medium has a significant impact on the course site design
(Smart & Cappel, 2006). Accordingly, the adoption of the most appropriate
media will not affect the learning goals and outcomes (Abou-Naaj et al.,
2012). Nevertheless, technology alone cannot create an efficient
learning environment if the theoretical grounding that reinforces the
course design is neglected (Babb, Stewart, & Johnson, 2010).

 Instruction domain: Maintaining instructional quality is crucial when


designing e-learning courses (Paechter, Maier & Macher, 2010; Abou-Naaj et
al., 2012). The instruction domain aims to select the most appropriate
instructional approaches and assessment methods that promote the learning
objectives and goals (Holden & Westfall, 2006); it also
concentrates on instructional design and considers learning outcomes, the
learner, course content, and instructional methods and strategies. De-Bourgh
(2003) found a positive correlation between student satisfaction and how well
courses are planned, designed, and taught.

 Instructor domain: The instructor is a central component in the education


process and this applies to modern learning environments. Instructor
performance has a direct impact on students’ satisfaction (Abou-Naaj et al.,
2012). There are several elements concerning the instructor in the education
process: instructor availability, timely feedback, effective communication, and
motivation for students (Almalki, 2011).

 Course management/administration domain: The course


management/administration aspect of the blended learning system is about the
interactive features used by the instructor in the course site to deliver the
course materials (Babb, Stewart, & Johnson, 2010). Instructors mainly used
the learning platform for resource delivery and course documents such as
syllabus publication, emails, gradebook, and online assignments or for
providing supplemental readings online (Abou-Naaj et al., 2012).

Khan’s Octagonal Framework

Singh (2003) cited a blended learning framework developed by Khan (see Figure

1). The framework is based on eight aspects: pedagogical, institutional, technological,

interface design, management, evaluation, ethical, and resource support. Each aspect in

the framework needs to be considered separately. The eight aspects aim to create a useful

learning experience.

Figure 1. Khan’s octagonal framework. Reprinted from Building effective blended


learning programs (p. 51), by H. Singh, 2003, Englewood Cliffs NJ: Educational
Technology-Saddle Brook. Reprinted with permission.

 The Pedagogical aspect is related to the combination of learners’ needs, the


content to be delivered, and learning goals.

 The Institutional aspect is concerned with student, administrative,


organizational, and academic affairs services.

 The Technological aspect is concerned with the technical requirements, such


as security, accessibility, software, hardware, server, requirements of the
LMS, and the program content.

 The Interface Design aspect addresses what users might require and ensures
that the interface specifications are easy to meet within the blended learning
program.

 The Program Management aspect is the most important phase in the process
of designing a blended program. It is concerned with launching, delivering,
and managing a blended learning program.

 The Evaluation aspect addresses the reliability and usability of the blended
learning program by examining students’ performance and system
effectiveness.
 The Resource Support aspect is concerned with providing students with
diverse types of resources and organizing them.

 The Ethical aspect deals with the ethical issues associated with diversity,
nationality, and providing students with equal opportunity.

Many researchers believe that blended learning is the future model for instruction

in higher education (Norberg et al., 2011; Ross & Gage, 2006). Despite the fact that there

are several ways of designing a blended course, whether by creating a whole course or

merely adding extra technological activities, the lack of a universally accepted definition

of blended learning led instructors to interpret it differently and thus design blended

courses based on personal preferences (Deperlioglu & Kose, 2013; Graham, 2012b; Lee,

Fong, & Gordon, 2013; Stacey & Gerbic, 2008). Hence, the main question now is how to

develop an effective blended learning system.

To refine the implementation of a blended learning system, the current

study draws on the existing body of knowledge on measuring the quality and effectiveness

of blended learning systems and suggests a number of recommendations that can help in

monitoring and improving the blended learning system currently applied at REU.

The Application of the Deming’s Cycle


in Higher Education

In higher education, the concept of continuous improvement may refer to an

institution’s ongoing commitment to quality development efforts to enhance processes

and practices related to student outcomes and system efficiency and effectiveness. Higher

education always faces the challenge of ensuring the quality of instruction. A possible

way to improve the quality of education lies in the use of quality tools and models
including Lean Six Sigma, Total Quality Management (TQM), and the Deming Cycle.

This study adopts one of the principles of the continuous improvement approach to

process and problem solving, the Plan-Do-Study-Act (PDSA) cycle, also called the

Deming Cycle (Deming, 1986), as an education methodology and a framework for

improvement, scrutiny, and enhancement of the quality of the blended learning system at

REU. As with all major industries, higher education is continuously seeking to raise the

quality of education through innovation and continuous improvement.

In a highly regarded study, Bhuiyan and Baghel (2005) reviewed the history

of the emergence and evolution of the Continuous Quality Improvement (CQI) approach.

Originally, Shewhart proposed CQI as a business philosophy (Shewhart, 1939). The

American Society for Quality describes the term “continuous improvement” as an

approach or process to problem solving on an ongoing basis to improve processes,

products, and services (American Society for Quality, n.d.). Several enhancement

approaches and models rest on Shewhart’s four-phase approach, including Deming’s

Cycle, Lean, and Define-Measure-Analyze-Improve-Control (DMAIC) (Park et al., 2013;

Flumerfelt & Green, 2013). The employment of the PDSA cycle has evolved over time

from a tool for supporting statistical control to a problem-solving method (Milgram,

Spector & Treger, 1999; Revelle, 2004). All of these models, in fact, have been used in

higher education due to the intense focus on expanding the quality of education in on-

going educational reforms. Particularly, this study adopted the Deming model to study

the effectiveness of the blended-learning system at REU and to investigate the system

weaknesses in order to formulate constructive solutions. Like other models of continuous

improvement, the PDSA cycle uses a cyclic process to

transform possible remedies into actions for continuous quality improvements.

In the 1950s, Deming modified Shewhart’s four-stage approach, which later became

the PDSA cycle (Deming, 1986). The PDSA cycle, a long-standing model, is grounded in

facts and data analysis. The Deming Cycle describes each of the four phases as

follows: “Plan: study a problem, collect data, define goals and objectives; Do: Identify

needs, propose a change, and implement a solution; Check: Monitor and evaluate the

intervention; Act: Adopt, adapt, or refine and reinstitute” (Brown & Marshall, 2008, p.

207).

The PDSA Cycle, also known as the Deming Wheel or Deming Cycle (Deming,

1986), is “a systematic series of steps for gaining valuable learning and knowledge for

the continual improvement of a product or process” (The W. Edwards Deming Institute®,

2018). See Figure 2.



Figure 2. “PDSA cycle and Model for Improvement—1991, 1994”. Reprinted from
Circling back (p. 22), by R. D. Moen, and C. L. Norman, 2010, Quality Progress.
Copyright 2010 by the BMJ Publishing Group Ltd. Reprinted with permission.

According to the American Society for Quality website (n.d.), “continual

improvement (CI) is the ongoing improvement of products, services or processes through

incremental and breakthrough improvements.” The PDSA cycle is among the most commonly used methods

in quality assurance and continuous improvement in higher education. In fact, the PDSA

cycle is frequently used for initial accreditation or reevaluation to establish a continuous

assessment that uses the results and evidence collected to inform administrative

decisions (Gazza, 2015). A large body of research has shown that employing

the PDSA problem-solving cycle yields several advantages, such as

improved student achievement and behavior, increased instructional time, and

ameliorated hallway transitions (Wheeless, 2009). In a case study conducted by

Shokraiefard (2011) in the Engineering School at Boras University, the PDCA cycle was

implemented as a basis for continuous improvement to assess the quality

of the education process for each academic year.

This section reviews some of the most important resources on the employment of

the PDSA Cycle, including the Health Literacy Universal Precautions Toolkit, 2nd

Edition: Including the Use of PDSA Cycles, prepared by Brega et al. (2015) and published

by the Agency for Healthcare Research and Quality, as well as a quality report written by

Cox, Peter, and Young (1999) named “Improving the repeat prescribing process in a busy

general practice. A study using continuous quality improvement methodology.” This

report provides detailed information on utilizing the CQI methodology. Another helpful

resource, a report developed by the National Learning Consortium (2013) titled

“Continuous Quality Improvement (CQI) Strategies to Optimize your Practice,” presents

theoretical detail on using CQI practices to achieve continual quality improvement.

Further literature and resources were investigated to support and provide evidence

in complementary areas of this study and strengthen the methodology section. Gazza

(2015) focused on using the PDSA cycle to enhance a new online course. Squires and

Cloutier (2011) presented a paper on applying the PDSA cycle as a viable process to

develop best learning practices in higher education. The United States Department of

Education (2017) issued the National Education Technology Plan, which urges educators

to execute revolutionary changes including technology-based learning and assessment,

engaging powerful learning content that is interactive, collaborative, visual/dynamic, and

more relevant to life and work.


The researcher applies the PDSA cycle to study the effectiveness of blended

learning and provide some recommendations for continuous quality improvements.

Knight and Allen (2012) employed the PDCA cycle as a methodological measure of

students’ learning assessment. In addition, it may be utilized for the development of

courses (Aggarwal & Lynn, 2012; Brown & Marshall, 2008). In fact, the PDSA cycle is

commonly used for initial or renewed accreditation to develop a continuous process of

improvement. However, some educators have questioned the application of the Deming

Cycle in educational practices (Taylor et al., 2013; Bhuiyan & Baghel, 2005;

Matulich, Papp & Haytko, 2008), while others presumed that there is a similarity between

the development of quality in education and its application in other industrial sectors

(Hughey, 2000; Coates, 2009).

Diffusion of Innovation Theory (DOI)

The DOI theory is one of the most important theories regarding innovation,

especially for HEIs, given the tremendous pressure from emerging education

technologies. Rogers (2003) summarized this theory as “the process by which an

innovation is communicated through certain channels over time among the members of a

social system” (p. 5). From his point of view, innovation is “any idea, practice, or object

that is perceived as new by an individual or other unit of adoption" (Rogers, 2003, p. 5).

According to Rogers (2003), there are four key factors influencing the prevalence of

innovation: the innovation itself, communication channels, time, and the

social system. In this case, the DOI represents the implementation of web-based learning

in education, including all forms of education techniques such as technology-supported learning,


hybrid learning, and distance-learning (Buć & Divjak, 2015). At present, the DOI theory

is commonly used in the educational sector (Almalki, 2011; Lundvell, 2010; Woodside &

Biemans, 2005). The process of developing an interactive learning environment depends

on organizational leadership and strategic alignment, together with the government's role

in controlling and guiding the diffusion of technology in higher education.

Application of the Literature

The review of the literature shaped the main themes

addressed in this study; accordingly, several references were studied to clarify

the purpose of the study, address the problem statement, and support the hypotheses. In

contemporary literature, authors agree that continuous quality improvement can be

applied to the learning process in higher education; they outline various instructional

strategies and describe a variety of quality models and tools that can be used by instructors

to improve their courses. Deming suggested that “win, win, [is] needed in education”

(1993, p. 152).

While the current literature regarding the application of quality tools and models

is rich in valuable findings, this study is particularly interested in linking the Deming

Cycle with the instructional process to sustain quality improvements in post-secondary

education.
CHAPTER 3

METHODOLOGY

This section introduces the approach used in this research. It elucidates the

research layout that has been applied to investigate students’ opinions and experiences

with blended learning at REU. It also covers the following components: design of the

investigation, Step 1 of the PDSA, sample selection, instruments used, data collection,

and data analyses.

Design of the Investigation

This research was based on a triangulated mixed-methods design propounded by Greene

et al. (1989), which is a combination of two key forms of research, qualitative and

quantitative research (Creswell & Creswell, 2017). The quantitative part uses numbers

and statistical tools (King, Keohane, & Verba, 1994), while the qualitative part seeks

narrative predictions and explanations of certain phenomena that will enhance

understanding through the use of words (Glesne & Peshkin, 1992). Qualitative

researchers are concerned with exploring a particular case, identifying new ideas, and

understanding patterns of behaviour (Hoy & Adams, 2015). Each method tackles the

research in a different way and is composed of various approaches within the framework

of the research. This chapter aims to outline the theoretical framework and the procedures

that were employed to administer this research. The methodology that was chosen to

perform and embrace this study is the PDSA Cycle of Deming (1986). As mentioned

earlier in Chapter 1, the objective of this research was to systematize continuous


improvement efforts, particularly for improving the quality of the blended learning

system, identify and analyze its effectiveness, and investigate the system best practices

and trends to improve the teaching/learning process throughout REU. Improvements

require the employment of knowledge, which is theory based. However, theories must be

regularly reviewed and examined by correlating predictions and findings. The PDSA

Cycle aids and expedites this operation. The framework of the research concentrated on

using the PDSA cycle as an essential structure to implement continuous improvements.

Figure 3 illustrates the PDSA framework. The overall study was based on piloting a

blended learning experience at REU, which represents one loop of the cycle.

Figure 3. Model for Quality Improvement. Created by the author of this thesis.

The cycle is a strategy that identifies, analyzes, and evaluates systems with the

end in view of determining a better way of doing things (Miclat, 2005). The model in
Figure 3 comprises two parts: first, the thinking part, which includes three basic questions

for attaining improvement (see Table 1) and second, the PDSA cycle. The stages in the

cycle are:

 Stage 1: Plan. The first step is to develop a plan to study and analyze the blended
learning system. What can be done to improve the current system? Determine the
data to be collected.

 Stage 2: Do. The plan will be carried out on a pilot basis by selecting one
course.

 Stage 3: Study/Check. Study or check the collected data.

 Stage 4: Act. Act on the piloting results.

Table 1

The Thinking Part


Question Methodology
1. What REU aims to  Establish an aim statement.
improve  Establish SMART (“specific, measureable,
attainable, relevant and time-bound”) goals.
2. How to find that change  Set quantitative measures to assess progress
is an improvement towards the aim statement.
3. What kind of changes  Set actions to accomplish the statement.
can be done that will lead
to improvement?
Note. Created by the author of this thesis.
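To make the framework concrete, one PDSA loop together with the thinking part can be captured as a simple structured record. The following is a minimal, purely illustrative Python sketch (an assumed data structure, not part of the study's instruments), with the aim, measures, and stage activities paraphrased from the cycle described above:

    # Illustrative record of one PDSA improvement loop for the blended learning pilot.
    pdsa_loop = {
        "aim": "Improve the quality of the blended learning system at REU",
        "measures": ["student satisfaction", "pass/fail rates", "drop rates"],
        "stages": {
            "Plan": "Study and analyze the blended learning system; decide what data to collect.",
            "Do": "Carry out the plan on a pilot basis by selecting one course.",
            "Study": "Check and analyze the collected data.",
            "Act": "Act on the piloting results (adopt, adapt, or refine).",
        },
    }

    for stage, activity in pdsa_loop["stages"].items():
        print(f"{stage}: {activity}")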

Step 1: Plan

This stage of the PDSA cycle includes outlining objectives, formulating a plan,

and determining measures for plan assessment (The W. Edwards Deming Institute®,

2018). The principal target of this research was to evaluate the effectiveness of the

current blended learning system. A root-cause analysis was developed to help identify
the potential sources of inefficiency in the system. In an education system, the sources of

defects are relatively complicated, as illustrated in Figure 4.



Figure 4. Cause-and-Effect Diagram. Created by the author of this thesis.

To address the study objective, the researcher selected the mixed methods

approach. This approach included quantitative and qualitative data collection. The

quantitative method was formulated through compiling indicators of student performance


in the chosen course (retrospective data analysis) and conducting a satisfaction survey for

the blended learning course. The qualitative method relied on obtaining the

students’ “word of mouth” through focus group interviews; the author also

performed a review of previous and current literature.

Sample Selection

In 2013, an online learning management platform, Moodle, was made available to

all instructors at REU as a new interactive learning platform. REU started implementing

the blended learning system in the pre-clinical courses, which are offered in the first and

second year. The blended learning model integrates traditional face-to-face class with e-

learning activities that learners able to accomplish individually or collectively through

consulting a specially designed website. In the blended system, instructors at REU

constructed Internet-based courses and sites as additional instructional resources for their

face-to-face lectures. To support the assessment process, it was vitally important that the

collected data with which the study was undertaken exhibited rich content; thus, the

targeted courses were selected according to the rate of access and the usage of the

blended course platforms. Another important aspect was access to the learning platform,

at least once per semester, whether by teachers or students. To avoid unfamiliarity

with the learning management system, the researcher sought to exclude first-year

students.

The selection process started by identifying courses and then choosing subjects

relevant to those courses. The first stage was accomplished through

multi-stage purposeful sampling, while the second stage used
a random purposeful sampling approach, in which “the researcher chooses cases

at random from the sampling frame consisting of a purposefully selected sample”

(Onwuegbuzie, Jiao, & Bostick, 2004, p. 126). The next sections explain the mechanism

of these processes.
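As an illustration of the random purposeful sampling step just quoted, the following is a minimal Python sketch; the sampling frame below is hypothetical and the course codes are placeholders, not the actual REU course list:

    import random

    # Hypothetical sampling frame produced by the purposeful screening stage;
    # the course codes below are placeholders, not actual REU courses.
    purposeful_frame = ["COURSE_A", "COURSE_B", "COURSE_C", "COURSE_D"]

    random.seed(42)  # fixed seed so the illustrative draw is reproducible
    # Random purposeful sampling: draw cases at random from the purposefully
    # selected frame (Onwuegbuzie, Jiao, & Bostick, 2004).
    selected = random.sample(purposeful_frame, k=1)
    print(selected)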

Course Selection Criteria

To provide powerful data to resolve the research problem, all participants in the

study had experience in the blended learning environment at REU. Therefore, the target

group for this research was exclusively those utilizing blended learning in their

instruction process. More precisely, instructors who had accessed the learning platform at

least once per week from the REU portal were included. From this population, courses

offered for students in the third to sixth year were excluded, as these courses contain

clinical hours as part of the total course credit hours. Thus, the courses in the second year

underwent another stage of the multi-stage purposeful sampling process. In the second

year of study, a total of 10 courses are offered. The researcher excluded six courses

because they were completely offered using the conventional face-to-face system or their

sites were still being constructed, whereas the other four courses were identified as

blended learning courses. The interactive educational sites differed in their

educational content, design, and objectives. The researcher designed an evaluation

checklist for each course website to investigate the content and configuration. The main

goal of the checklist was to identify the study participants. Therefore, the evaluation

criteria were limited to items that are relevant to the study objectives.
The findings of Chao, Saj, and Tessier (2006), Khan (2005), Hwang, Huang, and

Tseng (2004), and El-Tigi (2001) were reviewed to finalize the main aspects to be included

in the evaluation checklist. Further findings indicated that accessibility, content, design,

interactivity features, and communication were the most common features for evaluating

an online learning environment (Boklaschuk & Caisse, 2001; El-Tigi, 2001; McClue,

Esmail, & Eargle, 2006). Twenty-one closed (Yes/No) questions were identified for

assessing the quality of course sites. In addition, the researcher sought to formulate the

checklist in a clear and categorical manner to enhance the reliability of the results

(Bichelmeyer, 2003; Chao, Saj, & Tessier 2006).

Four potential course sites were identified on REU’s portal, and after

comparison against the evaluation criteria, only one course was selected to be included

in this research. The criteria of the selected course site are shown in Table 2.
Table 2

Course Site Evaluation Criteria


Evaluation Criteria S1 S2 S3 S4 S5 S6 S7 S8 S9 S10 S11 %
Website Accessibility
Instructor Access once
per Week √ √ √ √ √ √ √ √ √ √ √ 100
Students Access at least
once per week √ √ √ √ √ √ √ √ √ √ √ 100
Minimal Loading Time √ √ √ √ √ √ √ √ √ 82
Clear Navigation √ √ √ √ √ √ √ √ √ √ √ 100
Interactivity Features
Audio & Video √ √ 18
Chat √ 9
Discussion Board √ √ √ √ √ √ √ √ √ √ √ 100
Quizzes √ √ √ √ √ √ √ √ 66
Survey √ √ √ √ √ √ √ √ √ √ √ 100
Content
Recently updated (once a
week) √ √ √ √ √ √ √ √ √ √ √ 100
Course Syllabus √ √ √ √ √ √ √ √ √ √ √ 100
Classes (Day and time) √ √ √ √ √ √ √ √ 72
Course Description &
objectives √ √ √ √ √ √ √ √ √ √ √ 100
Course Requirement √ √ √ √ √ √ √ 64
Course Calendar √ √ √ √ √ √ √ √ √ √ 90
Grading System √ √ √ √ √ √ √ √ √ √ √ 100
Exercises √ √ √ √ √ 45
Lecture Notes &
Presentation √ √ √ √ √ √ √ √ √ √ 90
Exercises √ √ √ √ √ 45
Communication
Instructor Contact Detail √ √ √ √ √ √ √ √ √ √ √ 100
E-mail √ √ √ √ √ √ √ √ √ √ √ 100
Note. S= course site. Created by the author of this thesis.
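The per-criterion percentages in Table 2 can be reproduced by tallying the checkmarks across the evaluated sites. The following is a minimal Python sketch, assuming the checklist results are stored as booleans; the dictionary is an illustrative excerpt, not the complete Table 2 data set:

    # Illustrative tally of checklist results; True means the criterion was met
    # on that course site. This is a made-up excerpt, not the full checklist data.
    checklist = {
        "Clear Navigation":     [True] * 11,
        "Minimal Loading Time": [True] * 9 + [False] * 2,
        "Exercises":            [True] * 5 + [False] * 6,
    }

    for criterion, results in checklist.items():
        pct = 100 * sum(results) / len(results)
        print(f"{criterion}: {pct:.0f}%")  # e.g., Minimal Loading Time: 82%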

Table 2 indicates that the accessibility to the site by the instructor and students

was very good. In addition, the site included several help features and was easy to

navigate. Some add-ons of the Moodle learning platform were not activated by the
system administrator. The site was well designed using several interactivity features. As

well, the contents of the site were constantly revised and modified. The other three

course sites, in contrast, were limited in many aspects: site accessibility by the instructors

themselves was low (once per 11 weeks), most of the interactivity features were deactivated, and

most of the content was outdated. Therefore, the researcher excluded the other three

courses from participating in this study. Accordingly, the researcher conducted the

experiment with second-year undergraduate students in the BICH 221 (Biochemistry

221) course offered by the Preparatory Health Studies (PHS) Department.

To address the first and second hypotheses, the inclusion criteria were based on

courses that had been delivered by the same tutor since before the implementation of

blended learning (before 2013). Only one course was determined as a match for this

research criterion, BICH 221.

Student Recruitment Criteria

The researcher started the selection process by identifying the courses through

which to invite students to enroll in this study, designing criteria for student enrollment to

provide robust data for the study phase of the PDSA Cycle. The criteria were:

 Students who were registered in BICH 221 for the academic year 2016-17;

 Students above the age of 18;

 Students who were effectively engaged in the selected course, evident by the
history log in Moodle. To be included, the student must have accessed the
course site at least once per academic semester.

In this research, the course sites were examined against an evaluation checklist to

determine the one ideal course site, which, in turn, led to communicating with the course
instructor through a letter of invitation in order to gain the instructor’s cooperation. The

278 students associated with the selected course represented the study sample.

Stage 2: Do

Instrument and Data Collection

The Do stage of the PDSA cycle included execution of the plan, recording data,

and observation documentation (The W. Edwards Deming Institute®, 2018). To address

the research objectives and hypotheses, the researcher utilized the following instruments.

Self-administered Survey. In empirical research, the survey is a fundamental

mode of collecting and examining data in many social sciences areas and their applied

fields within a short period of time (Rossi, Wright, & Anderson, 2013; Creswell &

Creswell, 2017). This tool was used to address the satisfaction aspect of effectiveness of

blended learning and to test Hypotheses 3 and 4. Data were collected using commonly

accepted practices for the quantitative research method, a written anonymous survey

instrument. A well-designed and well-managed survey acts to reduce bias (Ornstein,

2013), while the non-disclosure of the identities of participants may induce more honest

responses (Birmingham & Wilkinson, 2003; Riffenburgh, 2012). Further, Johnson and

Christensen (2008) described the survey as a time-efficient instrument, which can be used

to collect quantitative data and is not limited to one research method.

A self-administered survey was used to study a sample of REU undergraduates

and to solicit their views regarding the selected course. The survey was also helpful for

collecting demographic data of the sample population, such as age and gender. The

researcher constructed the survey in a way to match the objectives. Gillham points out,
“The logical starting point for developing a questionnaire is to ask what your broad aims

are” (Gillham, 2008, p. 15). The survey examined the students’ opinions and experiences

in a blended learning environment at REU. To gather and measure data that answered the

research questions, a self-administered survey was adopted from several studies in

the literature. The survey instrument was constructed based on a student questionnaire

adapted from Long Island University (Long Island University, n.d.), which was pre-tested

for validity and reliability (Calderon, Ginsberg, & Ciabocchi, 2012). The original

questionnaire comprised three sections: Section A collects demographic data,

while Section B is composed of 26 items (Likert-scale) about students’ perceptions

regarding their blended course. The items covered the quality of the course in general and

the learning usefulness of the course components. Section C includes three open-ended

questions asking students to express their overall opinions and experiences with the

blended course.

Johnson and Christensen (2008) assumed that respondents would provide truthful

answers if the survey was clear, unambiguous, and short. The concepts of validity and

reliability are essential tools of the scientific method (Mertens, 2014; Johnson &

Christensen, 2008; Almalki, 2011).

Students were asked to fill in a three-page survey (see Appendix A). The

researcher distributed 308 hard copies of the questionnaire. In total, 216 completed copies

were retrieved and submitted for analysis (response rate > 70%). The estimated time

burden to fill in the survey form was 15-20 minutes, based on a pilot study with 10

students. The survey was composed of four parts covering demographic data, students’
perceptions of blended learning, open-ended questions, and domains that influence the

application of e-learning at REU. These sections were constructed based on the blended

learning model components and Khan’s octagonal framework. Blended models

provide several themes in relation to the blended learning environment, such as

interaction, environment, usefulness, technology, course management, and instruction.

The first section collected the participant’s gender and age. Other personal data

such as students’ IDs were not included. Accordingly, the collected data were analyzed

and correlated with other factors to address the third hypothesis.

The second, third, and fourth sections explored students’ experience with a

blended learning environment. These sections covered the following five domains based

on the Long Island University survey (n.d.): “instructor, technology, class management,

interaction, and instruction.” Each participant was asked to complete about 37 multi-part

questions on a 5-point Likert scale. The second section explored students’ satisfaction

regarding their experiences with the usefulness of the course site. The usefulness scale is

examined through the following items: accessibility, productivity, course satisfaction,

communications, and collaboration. A 5-point rating scale was employed for this section,

with answers ranging from 1 “very dissatisfied”; 2 “generally dissatisfied”; 3 “neither”; 4

“generally satisfied” to 5 “very satisfied.”

Section 3 contains the bulk of the items related to students’ learning experiences,

learning resources, engagement, and the accessibility rate of the course site by the

instructor and students. For this section, a 5-point Likert scale was adopted,
in which 1 symbolized “strongly disagree”; 2 “disagree”; 3 “neutral”; 4 “agree”; and 5

“strongly agree.”

The fourth section measured the degree of interactions within the course site,

whether student-instructor interaction or student-student interaction. In addition, a five-

point scale was applied using: 1, “much worse”; 2 “a little worse”; 3 “about the same”; 4

“a little better”; and 5 “much better.”

The last section comprised three optional open-ended questions, intended to

elicit the students’ opinions regarding their experiences with blended learning. These

were formulated to investigate the pros and cons of the digital platform and the

suggestions for continuous improvement of the blended learning system at REU. Other

items were related to the students’ opinions on their preferred learning models.

Survey Validity and Reliability

Validity and reliability must be assessed with respect to the quality of the data

collected by the research instruments; in this case, the data collection instrument is the

survey (Johnson & Christensen, 2008; Colton & Covert, 2013). A data collection tool is

considered valid if it measures what it is intended to measure. While reliability is a critical

prerequisite for valid measurement, it is not enough on its own for validity. Reliability,

in turn, is defined as the degree of consistency of the output of repeated

administrations under similar conditions (Ercan & Kan, 2004; Mertens, 2014; Bryman,

2016; Colton & Covert, 2013; Howell et al., 2013).


To assess content validity, 15 participants were asked to complete the survey.

A Cronbach’s alpha of 0.932 for internal consistency reliability indicated an excellent

internal consistency for the 36 items of the questionnaire.
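For reference, Cronbach's alpha can be computed directly from item-level responses. Below is a minimal Python sketch, assuming the Likert responses are coded 1-5 in a respondents-by-items NumPy array; the random data are placeholders, not the study's actual responses:

    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
        k = scores.shape[1]                         # number of items
        item_vars = scores.var(axis=0, ddof=1)      # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Placeholder data: 15 pilot respondents x 36 items, coded 1-5.
    rng = np.random.default_rng(0)
    pilot = rng.integers(1, 6, size=(15, 36))
    print(round(cronbach_alpha(pilot), 3))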

Recruitment Methods

Students were part of this research if they voluntarily agreed to complete the

survey form. The course director of each selected course was contacted and he or she

announced the opening of the study recruitment to the students enrolled in that course.

The course director contacted students through their college e-mails regarding how and

when a willing volunteer could participate in the survey. The research investigator, in

collaboration with the course director of each course, made an initial contact with

subjects and explained the intent of the study. Informed consent was given in writing (by

hand), with subjects signing the Consent Form (see attached Consent Form in Appendix

B).

Retrospective Cohort Analysis

A retrospective cohort analysis was conducted to collect and assess course

outcomes (course satisfaction and learning achievements) from historical records. Cohort

studies have a timeframe for the assessment of causality; therefore, they have the ability

to provide rigorous scientific evidence (Everitt & Palmer, 2005). The data were retrieved

from the Quality Assurance Centre database relating to the annual course reports and the

findings of the Course Evaluation Survey (CES). This study collected data corresponding

to the pre-blended learning experience (from 2008 to 2013) and the period in which

blended learning was activated (2013 to 2017). The experience of blended learning in the
BICH 221 course was assessed from several angles, primarily satisfaction and course

outcomes. To answer Hypothesis 1, the researcher used student satisfaction. To test the

second hypothesis, the evolution of course outcomes was analysed using the pass/fail

rates and drop rates. The distribution of grades (A, B, C, D, and F) is a reasonable

indicator of learning outcomes as well.

In this respect, the researcher collected the following variables to examine

students’ experiences with the blended course and to answer Hypothesis 2 (a computational sketch of the rate calculations follows the list):

 Year: the year was recorded to track periods before and after the
implementation of the blended learning system at REU.

 Application of blended learning (binary variable) was documented as yes/no.

 Student satisfaction is a key measure used by REU to gather information


about students’ experiences with their courses. The Course Evaluation Survey
(CES) is conducted on a regular basis. The researcher collected students’
satisfaction data regarding the BICH221 course from 2008 to 2017.

 Students’ passing rate was calculated as the number of students who passed/
total number of registered students per academic year (AY). In addition, the
percentage of students in each of the following grades per year was
documented:

o A: total number of A students per year/total number of registered


students in the same year.

o B: total number of B students per year/total number of registered


students in the same year.

o C: total number of C students per year/total number of registered


students in the same year.

o D: total number of D students per year/total number of registered


students in the same year.
 Failure rate: number of students who failed in the course/total number of
registered students per AY.

 Drop rate: number of students who did not complete the course/total number
of registered students per AY.
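The following is a minimal Python sketch of these rate calculations, assuming per-student records for one academic year; the sample data are invented for illustration and are not actual BICH 221 records:

    # Invented records for one academic year; each tuple is (grade, completed).
    # A grade of "F" marks failure; completed=False marks a drop.
    records = [
        ("A", True), ("B", True), ("B", True), ("C", True),
        ("D", True), ("F", True), ("B", False),
    ]

    registered = len(records)
    completed = [grade for grade, done in records if done]

    pass_rate = sum(g != "F" for g in completed) / registered
    failure_rate = sum(g == "F" for g in completed) / registered
    drop_rate = sum(not done for _, done in records) / registered
    grade_dist = {g: sum(x == g for x in completed) / registered
                  for g in ("A", "B", "C", "D", "F")}

    print(pass_rate, failure_rate, drop_rate, grade_dist)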

Focus Group

A focus group is “a special type of group in terms of purpose, size, composition

and procedures” (Krueger & Casey, 2014, p. 4). The intent of conducting a focus group is

to obtain an in-depth grasp of how people think and feel about a service or an issue

(Krueger & Casey, 2014). Participants were selected because they all had experiences

with blended learning. The researcher created a permissive environment to encourage

students to share their perceptions and opinions. This was followed by a systematic

analysis of the responses (Teddlie & Tashakkori, 2009), which was presumed to provide

insights as to how blended learning is perceived by students in BICH 221. This

instrument was used to collect qualitative data, guide system development, and ensure

consistency with the quantitative results. In most interviews, researchers present several open-

ended questions to participants regarding the subject and record their responses (Creswell

& Creswell, 2017).

The interview included the following questions:

 Tell me about your experience with the blended learning system.

 In your opinion, what are the advantages of such a system?

 What are the disadvantages?

 Which instruction modality do you prefer?


An invitation letter was forwarded to students registered in BICH 221 for the AY

2016-17 via e-mail. The formal email included information about the research, the

interview date, the researcher, confirmation of confidentiality, and the aim of this study.

Eight students participated in an interview that lasted for an hour. The researcher

requested the students to fill in a consent form stating their agreement to participate in the

study.

The validity of qualitative data depends on the techniques that are used to collect

research information and to construct the collection tool (Minichiello et al., 2008). A debate

exists in the literature regarding how to judge the validity of qualitative data. There are

several techniques employed to double-check the qualitative output validity, such as peer-

debriefing and triangulation (Creswell & Poth, 2016). To validate the qualitative data of

this study, a peer debriefing process was adopted. Creswell and Miller (2000) suggested

selecting an external person “who is familiar with the research or the phenomenon being

explored” (p. 129). For this validation technique, the course director was involved as a

reviewer. The researcher and the instructor reviewed the interview questions, the

responses, and the data analysis as a peer debriefing procedure.

For qualitative data analysis, the researcher used the thematic analysis method.

Thematic analysis is a widely adopted method for analysing qualitative data that focuses

on identifying meaningful patterns within a data set (Minichiello et al., 2008). Boyatzis

(1998) defined it as “the process for encoding qualitative information” (p. vi). Patterns

can be identified by following an accurate process of data definition and familiarization,

then data coding, and finally, themes development. The process of coding has been
interpreted as “a process of identifying aspects of the data that relate to your research

question” (Braun & Clarke, 2013, p. 206). The raw input was examined based on

meaningful features and characteristics. Themes related to students’ experiences with the

blended course are characteristics, advantages, and disadvantages, as displayed in Table

3. Each theme comprised several categories for further examination (a brief coding tally sketch follows Table 3).


Table 3

Themes and Codes of the Thematic Analysis


Theme Category Sub-category
Website Content Lectures (summary/notes; presentation; audio/video
characteristics recorded)
Tests/exams (exam samples; electronic tests)
Syllabus (objectives, requirements, schedule)
External references (related websites; e-books; video clips)
Design (clarity/attraction)
Communication / Announcements (exams; activities; results;
interaction absent/attendance rate)
Contribution (course materials; commenting)
Feedback (answer enquiry; instructor‘s contact details)
Course Provide online submission feature
administration Routine maintenance
Offer technical support
Website Learning Students’ learning outcomes/achievements (exam and test
advantages results)
Enhanced course delivery
Communication/Interactions
Time
Teaching Facilitates understanding
Enhance interaction/online discussion
Preparation before lectures
External references
Promotes learning
Communication Enhancement of communications
Increase communication flexibility
Skills
Website Course site Lectures summary & slides
disadvantages management Discussion forums
/administration Ability to comment
and design Technical issues
Poor structure
Students Online extra activities
Technical issues Internet service at home
Note. Reprinted from Blended learning in higher education in Saudi Arabia, by Almalki,
(2011), retrieved from https://researchbank.rmit.edu.au/eserv/rmit:14613/Almalki.pdf
Copyright 2017 by RMIT University. Obtained with permission.
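To make the coding step concrete, the sketch below shows one way coded interview segments could be tallied against the themes and categories of Table 3. It is a minimal Python illustration with invented segments, not the study's transcripts:

    from collections import Counter

    # Invented coded segments: (theme, category) pairs assigned during coding.
    coded_segments = [
        ("Website advantages", "Communication"),
        ("Website advantages", "Learning"),
        ("Website advantages", "Communication"),
        ("Website disadvantages", "Technical issues"),
        ("Website characteristics", "Content"),
    ]

    # Tally how often each theme/category combination appears across the data.
    counts = Counter(coded_segments)
    for (theme, category), n in counts.most_common():
        print(f"{theme} > {category}: {n}")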

Study Location

Fieldwork was conducted in REU. Formal permission was required from the

ethical committee of the college. The Institutional Review Board (IRB) approval can be

found in Appendix C.
Data Analyses

Researchers consider the data analysis phase to be the most significant, since this

phase involves transforming the raw data into valuable information using the appropriate

analytical procedures and statistical tests to address the research problem (Creswell &

Creswell, 2017). Descriptive statistics including averages, medians, standard deviations,

ranges, percentages, and distributions, were applied for quantitative data (e.g., age).

Concerning the qualitative data, a thematic analysis process was employed for data

treatment. The Shapiro-Wilk test of normality was run on all scale variables. Non-

parametric tests were implemented whenever the variables were skewed. Moreover, the

Mann-Whitney U test was also applied to compare the values of a quantitative variable in

relation to a categorical variable with two categories. Spearman’s statistic was used

to measure the rank-order correlation between two scale or ordinal variables. In addition,

linear regression was implemented to examine the linear relationship between an

outcome (dependent) variable and the predictor variables. The IBM® SPSS® 22

(Statistical Package for the Social Sciences, Version 22) was utilized for data analysis.
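Although the analyses were run in SPSS, the same procedures are available in open-source tools. The following is a minimal Python sketch using SciPy on invented data, intended only to illustrate the named tests, not to reproduce the study's results:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    satisfaction = rng.integers(1, 6, size=216).astype(float)  # invented 5-point scores
    gender = rng.integers(0, 2, size=216)                      # invented binary grouping
    age = rng.normal(20.4, 1.1, size=216)                      # invented ages

    # Shapiro-Wilk test of normality on a scale variable.
    print(stats.shapiro(satisfaction))

    # Mann-Whitney U test: compare satisfaction across the two groups.
    print(stats.mannwhitneyu(satisfaction[gender == 0], satisfaction[gender == 1]))

    # Spearman rank-order correlation between two scale/ordinal variables.
    print(stats.spearmanr(age, satisfaction))

    # Simple linear regression of an outcome on a predictor.
    print(stats.linregress(age, satisfaction))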
CHAPTER 4

RESULTS AND DISCUSSION

Stage 3: Study/Check

This chapter discusses the study phase of the PDSA cycle and presents the most

important results achieved based on the statistical treatments conducted in light of the

data collected and analyzed through the tools of the study.

This research aimed at examining the opinions and experiences of REU

undergraduate students. Specifically, the study scrutinized the effectiveness of the

blended learning system at REU. In addition, it involved the assessment of a blended-

learning platform that has been utilized to deliver part of the course content online. The

study also examined the components that may influence the implementation of online

learning at REU. This study was conducted to answer the question: How did undergraduate

students at REU perceive the application of blended learning? To answer this

question, the investigator examined the collected data to identify the trends related

to the blended learning system.

In order to improve the data analysis, a mixed-methods research approach was used

(Creswell & Creswell, 2017; Creswell & Plano Clark, 2007; Doyle, Brady, & Byrne,

2009). The researcher also intended to enhance validity by linking qualitative and

quantitative data (Creswell & Creswell, 2017; Creswell & Tashakkori, 2007; Teddlie &

Tashakkori, 2009). This chapter introduces the results of the surveys and the interviews,

the analyses of the findings, and the discussion of the results.


One course was selected under the conditions of the selection criteria as the

sample for this study. Face-to-face interviews included nine students, surveys were

distributed to undergraduate students who were actively involved in BICH 221, and a

retrospective cohort analysis was performed on the same course.

This chapter covers three sections of the analysis stage of this research. The first

is related to the self-administered survey analyses, the second section analyzes the

retrospective data, and the last section presents the analysis and the discussion of the

qualitative data.

Self-Administered Survey Analyses

This section consists of three main subsections, based upon the plan of the survey.

The first part illustrates the respondents’ demographics. The second part investigates the

students’ opinions concerning the blended course site, developed by examining the

results of each item of the survey, studying students’ feedback on the main components of

the online part, and conducting regression analysis of the survey items. The final part

covers the open-ended questions.

Demographic Data

The researcher distributed 308 hard copies of the survey. A total of 216 completed

copies were retrieved and submitted for analysis (response rate > 70%). All participants

completed the BICH 221 course in the academic year 2016-2017. Approximately two-

thirds of the participants were females (65.3%, n=141), and 34.7% (n=75) were males

(Figure 5). The mean (±SD) age of the participants was 20.38 years (±1.14), ranging from

18-25 years. The mean (±SD) age of the female participants was 20.88 years (±1.10),
ranging from 20-25 years and the mean (±SD) age of the male participants was 20.11

years (±1.10), ranging from 18-23 years.

Figure 5. The percentages of male and female respondents.

The predominance of female participants was an expected outcome because more

females registered for the course than males, with a ratio of approximately 2:1.

Survey Item by Item Analysis

The research was reinforced by the survey’s answers, which reflected the

students’ experience with the blended course. The survey’s questions were intended to

examine the overall students’ perceptions regarding the blended course by investigating

their individual experiences and the value they placed on the course site as a

tool for learning and communication. The author also explored the students’ opinions on

the instructor-learner interaction and the student-to-student interaction. In addition, the

individual learning experience was also examined in this study.


Firstly, the students responded about their satisfaction with the blended learning

course (Q1) as follows: 20.8% very satisfied, 56.9% generally satisfied, 16.7% neither,

5.6% generally dissatisfied, and 0% very dissatisfied. This indicates that most

of them, approximately 78% (20.8% + 56.9%), were satisfied (either generally satisfied or very satisfied)

with their experience with blended learning. This is in agreement with the findings

reported by Bailey (2002), Futch, Dziuban, and Charles (2005), Alebaikan (2010), Julio,

María, and Angel (2010), Wu and Liu (2013), and Abou-Naaj et al. (2012).

The fourth section of the survey consisted of nine items (Q 2-10) that investigated the

scale of interaction and collaboration in a blended learning environment compared to

traditional courses, whether student-instructor interaction or student-student interaction.

A five-point scale was applied to cover this domain using 1 to represent “much worse,” 2

“a little worse,” 3 “about the same,” 4 “a little better,” and 5 “much better” (see Table 4).
Table 4

Survey Results for Items 2-10


Response scale: 1 = Much worse, 2 = A little worse, 3 = About the same,
4 = A little better, 5 = Much better. Cells show frequency (percent); Mdn = median.

Q2  “The amount of your interaction with other students”:
    3 (1.4), 9 (4.2), 57 (26.4), 96 (44.4), 51 (23.6); Mdn 4
Q3  “The quality of your interaction with other students”:
    0 (0.0), 0 (0.0), 63 (29.2), 96 (44.4), 57 (26.4); Mdn 4
Q4  “The amount of your interaction with the instructor”:
    6 (2.8), 6 (2.8), 39 (18.1), 87 (40.3), 78 (36.1); Mdn 4
Q5  “The quality of your interaction with the instructor”:
    6 (2.8), 6 (2.8), 42 (19.4), 78 (36.1), 84 (38.9); Mdn 4
Q6  “A blended learning course makes it more important for students to visit the
    lecturer during office hours”:
    6 (2.8), 12 (5.6), 48 (22.2), 84 (38.9), 66 (30.6); Mdn 4
Q7  “I am satisfied with the way I interact with other students”:
    3 (1.4), 6 (2.8), 42 (19.4), 96 (44.4), 69 (31.9); Mdn 4
Q8  “I am satisfied with my participation in the course”:
    0 (0.0), 9 (4.2), 30 (13.9), 108 (50.0), 69 (31.9); Mdn 4
Q9  “I am satisfied with the process of collaboration activities during the course”:
    6 (2.8), 15 (6.9), 57 (26.4), 78 (36.1), 60 (27.8); Mdn 4
Q10 “Generally, I am more engaged in my blended courses”:
    6 (2.8), 9 (4.2), 54 (25.0), 72 (33.3), 75 (34.7); Mdn 4

Note. Question text adapted from the Long Island University survey (n.d.). The author used a
five-point scale adopted from Iowa State University (2010). Mdn = median.
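As a worked check of the reported medians, the median can be recovered directly from each item's frequency distribution. A minimal Python sketch using the Question 2 counts from Table 4:

    import numpy as np

    # Frequencies for Question 2 (Table 4): counts of responses 1..5.
    counts = [3, 9, 57, 96, 51]
    responses = np.repeat([1, 2, 3, 4, 5], counts)  # expand to 216 individual answers
    print(np.median(responses))  # -> 4.0, matching the reported Mdn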

As presented in Table 4, the implementation of the e-learning platform in the

course BICH 221 made a remarkable contribution to improving the students’ perception

of interaction. When students were asked about their views concerning the amount and

quality of the interaction and collaboration in the course compared to traditional courses,

the largest proportion of participants, ranging from 36% to 50%, responded with “a little better” for

Questions 2, 3, 4, 6, 7, 8, and 9. The highest percentage was for Item 8 “I am satisfied


with my participation in the course.” For Questions 5 and 10, the majority of responses

were “much better.” The response to Item 8 showed that nearly 82% of the respondents

agreed positively with their level of participation in the blended course. This finding

reflected the ability of the course’s digital platform to help students improve their course

participation and to facilitate the communication process. These questions review the

benefits of the blended learning system. Some of these benefits are in line with the

literature. In addition, the literature review in Chapter 2

demonstrates that advantages include cost effectiveness, flexibility, and accessibility.

Notably, for Question 9 “I am satisfied with the process of collaboration activities

during the course,” approximately 10% showed a negative response. This result can be

attributed to the students’ lack of training in the various activities provided by

Moodle. It may also reflect the fact that students had difficulties accessing and

dealing with the digital platform (e.g., intermittent Internet connection).

The general positive response to the level of “interaction and collaboration” in the

blended course is in accordance with the findings of Almalki (2011), in which the general

evaluation of the interaction domain was highly positive.

When examining other items, Table 5 illustrates the findings concerning the

students’ perception of the level of effort and performance required by this course.

“Generally satisfied” was reported by the majority of students on the level of effort the

course required, their performance in the course, and their learning experience in

comparison to a face-to-face setting.


Table 5

Questions 20, 21, and 22


Response scale: 1 = Very dissatisfied, 2 = Generally dissatisfied, 3 = Neither,
4 = Generally satisfied, 5 = Very satisfied. Cells show frequency (percent); Mdn = median.

Q20 “I am satisfied with the level of effort this course required”:
    6 (2.8), 12 (5.6), 36 (16.7), 99 (45.8), 63 (29.2); Mdn 4
Q21 “I am satisfied with my performance in this course”:
    6 (2.8), 12 (5.6), 57 (26.4), 90 (41.7), 51 (23.6); Mdn 4
Q22 “Compared to face-to-face course setting, I am satisfied with this learning experience”:
    9 (4.2), 18 (8.3), 33 (15.3), 90 (41.7), 66 (30.6); Mdn 4

Note. Question text adapted from the Long Island University survey (n.d.). Mdn = median.

As presented in Table 5, the majority of students (75%) expressed their

satisfaction with the level of workload required by this course. This result showed that

the blended course enables learners to manage their time more effectively, which improves

study flexibility. In addition, it improves the accessibility to course materials anytime

anywhere. This result also indicates that the students found it easy to deal with the online

portion of the course, as the perception about the level of effort needed in a system is

reflected by how easy it is to handle (Davis & Wong, 2007). The accessibility to

course materials online improves the comfort of students’ learning. In addition, it

provides students with more time to read, understand, and accomplish their homework
and assignments. Moreover, it provides an opportunity for absent students to familiarize

themselves with the course material (Almalki, 2011).

Since the amount of effort and student academic performance are related, the

results of Question 21 were in alignment with the outcomes of the previous question. The

researcher attributed this result to the implementation of blended learning, an

environment allows the instructor to communicate easily with students. He/she can also

provide them with the latest updates and online activities, which can save a lot of their

time. Furthermore, students have the ability to access course materials and do and submit

their quizzes and assignments at any time and from anywhere. In the context of such

information, Rosset, Douglis, and Frazee (2003), Thomas et al. (2005), Chen et al.

(2007), and Jee and O’Connor (2014) unanimously agreed upon the positive influence of

blended learning on the performance of learners.

The researcher attributed this result to the fact that the blended method of

education provided the students with the opportunity to accomplish part of the

educational activities through the online portal, which helps the learners improve their

performance and obtain a good learning experience. The course site can also help

students manage their time properly. Students had slightly divided opinions

concerning the item “Compared to face-to-face course setting, I am satisfied with this

learning experience,” in which approximately 13% responded negatively and

approximately 71% responded positively. This reflects that a small percentage of students

still favor the face-to-face course environment. At REU, blended learning is applied

through delivering the usual credit hours of the course via face-to-face instruction while
68
pursuing additional learning activities online (e.g., assignments, quizzes, and videos).

Table 6 includes nine items (Q 11-19) related to the course site’s usefulness.

Table 6

Survey Results for Items 11-19

Response scale: 1 = Strongly disagree, 2 = Disagree, 3 = Neutral, 4 = Agree,
5 = Strongly agree. Cells show frequency (percent); Mdn = median.

Q11 "I am more likely to ask questions in a blended learning course":
    1: 6 (2.8); 2: 9 (4.2); 3: 54 (25.0); 4: 96 (44.4); 5: 51 (23.6); Mdn = 4
Q12 "There are more opportunities to collaborate with others in a blended course":
    1: 3 (1.4); 2: 9 (4.2); 3: 66 (30.6); 4: 96 (44.4); 5: 42 (19.4); Mdn = 4
Q13 "My blended course experience has increased my opportunity to access and use information":
    1: 6 (2.8); 2: 6 (2.8); 3: 72 (33.3); 4: 78 (36.1); 5: 54 (25.0); Mdn = 4
Q14 "Blended learning helps me better understand course material":
    1: 6 (2.8); 2: 9 (4.2); 3: 75 (34.7); 4: 75 (34.7); 5: 51 (23.6); Mdn = 4
Q15 "The use of blended learning technology in this course encourages me to learn independently":
    1: 6 (2.8); 2: 6 (2.8); 3: 66 (30.6); 4: 90 (41.7); 5: 48 (22.2); Mdn = 4
Q16 "My understanding is improved compared to similar face-to-face courses I studied before":
    1: 3 (1.4); 2: 21 (9.7); 3: 45 (20.8); 4: 96 (44.4); 5: 51 (23.6); Mdn = 4
Q17 "My performance in exams is improved compared to similar courses I studied before":
    1: 6 (2.8); 2: 9 (4.2); 3: 66 (30.6); 4: 81 (37.5); 5: 54 (25.0); Mdn = 4
Q18 "I enjoy working on assignments by myself":
    1: 21 (9.7); 2: 15 (6.9); 3: 51 (23.6); 4: 90 (41.7); 5: 39 (18.1); Mdn = 4
Q19 "Generally, I understand course requirements better in a blended learning course":
    1: 6 (2.8); 2: 9 (4.2); 3: 66 (30.6); 4: 78 (36.1); 5: 57 (26.4); Mdn = 4

Note. Question text adapted from Long Island University survey (n.d.).
As shown in Table 6, the most common response to Questions 11-19, chosen by 34.7% to 44.4% of respondents, was "Agree." In fact, there was a consensus among students that the blended course enhanced their ability to ask questions, collaborate with other students, access course materials easily, develop their understanding of the content of the course, work independently, improve their performance in exams, and increase their understanding of the curriculum requirements. Generally, most of the responses agreed on the positive effect of the blended course on students' learning experiences. This corresponds with the findings of Waha and Davis (2014), who concluded that students are generally inclined to use independent and individual learning for part of the course activities. They also inferred that 78% of the students believed that the course site is an efficient and dynamic way of collaborating and engaging with their instructor and peers. Meanwhile, some students (17%) reported that they did not enjoy completing and submitting their assignments through the course site; this may be attributed to some students' lack of experience with the online tools available in Moodle.

Other items of the survey covered the instructor, instruction, and technology domains. The most common response to Questions 23-33, chosen by 37.5% to 56.9% of respondents, was "Agree" (Table 7).


Table 7

Survey Results for Items 23 to 33

Response scale: 1 = Strongly disagree, 2 = Disagree, 3 = Neutral, 4 = Agree,
5 = Strongly agree. Cells show frequency (percent); Mdn = median.

Q23 "The instructor made me feel that I am a true member of the class":
    1: 3 (1.4); 2: 18 (8.3); 3: 48 (22.2); 4: 81 (37.5); 5: 66 (30.6); Mdn = 4
Q24 "The instructor used Moodle technology appropriately":
    1: 3 (1.4); 2: 12 (5.6); 3: 33 (15.3); 4: 99 (45.8); 5: 69 (31.9); Mdn = 4
Q25 "The instructor clearly communicated important due dates/time frames for the activities":
    1: 3 (1.4); 2: 3 (1.4); 3: 30 (13.9); 4: 96 (44.4); 5: 84 (38.9); Mdn = 4
Q26 "The instructor provided clear instructions on how to participate in the course activities":
    1: 3 (1.4); 2: 9 (4.2); 3: 21 (9.7); 4: 123 (56.9); 5: 60 (27.8); Mdn = 4
Q27 "The instructor encouraged students to explore new concepts in the course":
    1: 3 (1.4); 2: 6 (2.8); 3: 63 (29.2); 4: 90 (41.7); 5: 54 (25.0); Mdn = 4
Q28 "Feedback on evaluation of tests and other assignments was given in a timely manner":
    1: 0 (0.0); 2: 15 (6.9); 3: 33 (15.3); 4: 120 (55.6); 5: 48 (22.2); Mdn = 4
Q29 "I am satisfied with the accessibility and availability of the instructor":
    1: 3 (1.4); 2: 3 (1.4); 3: 48 (22.2); 4: 108 (50.0); 5: 54 (25.0); Mdn = 4
Q30 "I attend all the activities offered through Moodle (quiz, assignment, videos, etc.) the same way I attend face-to-face classes":
    1: 6 (2.8); 2: 12 (5.6); 3: 33 (15.3); 4: 90 (41.7); 5: 75 (34.7); Mdn = 4
Q31 "Technical problems are not frequent and they do not adversely affect my understanding of the course":
    1: 6 (2.8); 2: 24 (11.1); 3: 33 (15.3); 4: 84 (38.9); 5: 69 (31.9); Mdn = 4
Q32 "The technology used for blended teaching (Moodle) is reliable":
    1: 3 (1.4); 2: 6 (2.8); 3: 39 (18.1); 4: 123 (56.9); 5: 45 (20.8); Mdn = 4
Q33 "Course content shown or displayed using Moodle is clear":
    1: 3 (1.4); 2: 9 (4.2); 3: 48 (22.2); 4: 117 (54.2); 5: 39 (18.1); Mdn = 4

Note. Question text adapted from Long Island University survey (n.d.).
Table 7 revealed that more than 70% of students were satisfied with the level of

interaction and collaboration with the course instructor in the blended course. Two items,

Questions 25 and 26, surpassed the other items. This showed that the course director had

a prominent role in designing the digital platform. It also illustrates that the teacher's

timely response and continued engagement with students are important factors in

students' satisfaction. This is in line with what has already been concluded by Dziuban et al. (2013), Arbaugh et al. (2008), and Akyol, Vaughan, and Garrison (2011). Specifically,

the online portion of the blended course requires tutors to dedicate a considerable amount

of time to communicate and engage with students (Cho & Tobias, 2016; Arbaugh et al.,

2008; Davidson-Shivers, 2009; Garrison & Arbaugh, 2007). The findings in Table 7

confirm the significance of the instructor’s role in blended learning environments (Cho &

Kim, 2013; Hew, Cheung, & Ng, 2010; Gedik, Kiraz, & Ozden, 2013).

Although the online learning activities and tools provided in the course were generally accepted, there was a concern with regard to Question 31, in which 14% of students revealed that they had encountered technical issues while using the course site. One interpretation is that the students themselves were not technically competent enough to deal with the course's electronic content or the computer hardware. In addition, a sluggish Internet connection or low bandwidth may have prevented students from keeping up with their virtual colleagues. Many researchers and professionals have stressed the importance of designing an effective course site. In the online portion, the student's learning experience takes place through technology; accordingly, the design of the digital learning platform is critical. In an online article, Mary Burns stated that "poorly designed technology-based courses can confound learning, frustrate learners and instructors, and result in high attrition rates" (Burns, 2016). Having technical problems could negatively influence the students' learning experience (Wang & Huang, 2018; Waha & Davis, 2014).

The researcher deepened the investigation, trying to understand the students’

perceptions of online learning tools. When participants were asked to describe their

experience with the tools in Moodle (lesson files, online tasks, forums, quizzes, and

surveys), the majority reported that they found these tools easy to use and useful. By

contrast, 6% of the students were convinced that these tools were not useful to them,

whereas 22% of respondents mentioned that not all the tools were utilized in their

blended course. The results obtained in Table 8 led to the conclusion that today’s students

are exceptionally tech-savvy and they appreciated the convenience and freedom of using

such tools. The findings also indicate that the course site is well designed and

commensurate with the level of most of the students' skills. This result can be supported

by the inferences of Kvavik and Caruso (2005), in which they asserted that 93% of post-

secondary students have a computer or laptop and most students described themselves as

highly skilled in using computer applications and the Internet. Additionally, most of the

college students have high-speed Internet access (Levin & Arafeh, 2002). Several factors affect students' perceptions of online learning tools, and these go beyond individual skills. A study in higher education found a connection between online learning tools and user behavior (Martinez-Torres et al., 2008). This suggests that students' resistance to the introduction of technology into education is a sensitive factor in e-learning, regardless of the potential gains of integrating technology in education. Another factor is instructor competence with e-learning tools, which is crucial to developing a successful blended course (Johnson et al., 2012).


Table 8

Multiple Choice Questions

"How would you describe your experience on the tools in Moodle (lesson files, online tasks, forums, quizzes, surveys)?" (Mode = 2)
    1. Not all tools were used for our course: 48 (22.2)
    2. I find this tool easy to use: 87 (40.3)
    3. I find this tool too hard to use: 12 (5.6)
    4. I find this tool useful: 57 (26.4)
    5. I do not find this tool useful: 12 (5.6)

"Compared to your other courses, was the workload in this blended learning course?" (Median = 3)
    1. Too light: 9 (4.2)
    2. Light: 54 (25.0)
    3. Moderate: 129 (59.7)
    4. Heavy: 9 (4.2)
    5. Too heavy: 15 (6.9)

"Which class modality do you prefer?" (Mode = 1)
    1. Entirely face-to-face: 99 (45.8)
    2. Minimal use of the web, mostly held in face-to-face format: 30 (13.9)
    3. An equal mix of face-to-face and web content: 63 (29.2)
    4. Extensive use of the web, but still some face-to-face class time: 15 (6.9)
    5. Entirely online: 9 (4.2)

Note. Values are frequency (percent). Developed by the author of this thesis.

In the literature, some research has questioned treating the ease of use of e-learning tools as an influence on student satisfaction. Instead, other factors of higher importance were identified for judging the efficiency of the blended learning experience, namely the implementation of the tool, the lack of awareness, the quality of communication (speed and consistency), and the course organization (Armstrong, 2011).

Many studies deal with how to design an effective online learning environment,

which in turn stimulates and motivates the student to participate, interact, and gain new

knowledge. In the blended course, the course director is responsible for establishing a balance between conventional class teaching and web-based instruction (Gedik, Kiraz, & Ozden, 2013), and for linking the online activities and tools with the course objectives and the learning outcomes (NCAAA, 2009).

Compared to their other courses, the majority stated that the workload in this

blended learning course was moderate. Although many of the participants expressed their

satisfaction with the blended course, approximately 46% admitted that they still prefer

the face-to-face instructional modality (Table 8).

The responses to the question "Which class modality do you prefer?" reveal substantial resistance to the adoption of blended learning. With reference to the results presented in Table 8, the largest share of students (45.8%) still preferred the traditional learning system. This preference may be caused by weaknesses in the design of the blended course, or it could simply be a matter of students' resistance to ICT. This result is compatible with the finding reported by Lopez-Perez et al. (2011), in which higher scores were recorded for the face-to-face (FTF) modality. The answers to this direct question about the preferred modality raised a concern that the question may not reflect overall satisfaction with blended learning, and may in fact contradict it.
Domains Analyses

The blended course survey had a group of items regarding the contents of the

course site. The survey investigated the students’ perception of the main components of

the self-directed online portion of the intended course. These components are closely

related to one another and have an impact on student satisfaction, as documented in the literature (Kintu, Zhu, & Kagambe, 2017).

Regression Analysis of Survey Items. In this analysis, the results of the multiple

linear regression models examining satisfaction rate per respondent are summarized in

Tables 9 and 10. Interaction domain, instruction domain, instructor domain, course

management domain, and technology domain were significantly associated with an

increase in satisfaction rate per respondent.

Each one-unit increase in the interaction domain was associated with a 0.283-unit

increase in satisfaction rate per respondent (p< .001). Each one-unit increase in the

instruction domain was associated with a 0.382-unit increase in satisfaction rate per

respondent (p< .001). Each one-unit increase in the instructor domain was associated with

a 0.214-unit increase in satisfaction rate per respondent (p< .001). Each one-unit increase

in the course management domain was associated with a 0.034-unit increase in

satisfaction rate per respondent (p< .001). Each one-unit increase in the technology

domain was associated with a 0.083-unit increase in satisfaction rate per respondent (p<

.001).
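For readers who wish to reproduce this type of analysis, the following is a minimal sketch in Python of a multiple linear regression of the satisfaction rate on the five domain scores. It is illustrative only: the file name and column names are hypothetical rather than taken from the thesis data, and the modeling choices of the original analysis may differ.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical layout: one row per respondent, with the five domain scores
    # and the overall satisfaction rate, all expressed as percentages.
    df = pd.read_csv("survey_scores.csv")    # assumed file name

    predictors = ["interaction", "instruction", "instructor",
                  "course_management", "technology"]
    X = sm.add_constant(df[predictors])       # adds the regression constant
    y = df["satisfaction"]

    model = sm.OLS(y, X).fit()
    print(model.summary())                    # unstandardized B, t, and p values

    # Standardized (beta) coefficients can be obtained by z-scoring the variables.
    cols = predictors + ["satisfaction"]
    z = (df[cols] - df[cols].mean()) / df[cols].std()
    beta_model = sm.OLS(z["satisfaction"], sm.add_constant(z[predictors])).fit()
    print(beta_model.params)                  # comparable to the Beta column of Table 9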

Beta was strongest for the instruction domain (0.439), which suggests the strongest effect, followed by the interaction domain (0.332), instructor domain (0.238), technology domain (0.103), and course management domain (0.059). The instruction domain was therefore deemed a major influence on students' satisfaction. This domain addresses the communication between the instructor and the students and the instructor's skill in dealing with electronic content. This notable result indicates that the

students acknowledged the influence of the instructor on their blended learning

experiences. Because of the strong relation between the instructor domain and students' satisfaction, it is important to understand which factors influence the instructor's integration of technology within the course. The researcher believes that this result can be

explained by the following factors: the level of the ICT competence of the instructor, the

available ICT infrastructure at REU, the instructor’s attitude towards the implementation

of technology in education, the level of instructor-to-student interaction and

communication, and the training on how to implement technology in blended courses.

UNESCO identified instructor education as one of the crucial strategies for integrating

ICTs in education (UNESCO, 2002).

The result for the course management domain is of particular concern to this study, because it shows that this domain had the smallest impact. This limited effect can be attributed to the fact that the survey questions in this domain were few and did not capture the real effect of course management on students' satisfaction. However, implementing appropriate course management strategies is essential to success in a blended course. These strategies are related to having clear directions, a strong syllabus, timely feedback, and well-structured materials. Abou-Naaj et al. (2012) confirmed a direct relationship between course management tools and students' satisfaction. However, Woods, Baker, and Hopper (2004) found that instructors mainly use the LMS as a course management tool to manage students' grades and to post course materials. In the course BICH 221, the dominant course management use of Moodle was for uploading course materials, conducting quizzes, and delivering resources. Student respondents reported that not all the interactive features in Moodle were used during course delivery, which might have negatively affected satisfaction. Therefore, the outcomes in this area are inconsistent with the findings of other studies, which have reported that course management impacts student satisfaction with the online portion of the blended course.

Table 9

Regression Analysis of the Five Blended Learning Domains

                             B      Std. Error   Beta    t         p value
Constant                     .358   .162         -       2.207     .028
Interaction Domain           .283   .003         .332    104.855   .000
Instruction Domain           .382   .003         .439    135.441   .000
Instructor Domain            .214   .003         .238    76.388    .000
Course Management Domain     .034   .001         .059    24.576    .000
Technology Domain            .083   .003         .103    31.304    .000

Note. B and Std. Error are unstandardized coefficients; Beta is the standardized coefficient. Created by the author of this thesis.

Multiple linear regression analyses showed that all items in the survey questionnaire had a significant effect on respondent satisfaction (p < .001); Hypothesis 3 was therefore accepted. The item "I enjoy working on assignments by myself" had the largest beta

of all variables in the analysis (0.062), followed by “Compared to face-to-face course


setting, I am satisfied with this learning experience” (0.057), and “Technical problems

are not frequent and they do not adversely affect my understanding of the course.”

(0.057). The analytical data of the five survey items with the highest beta are shown in

Table 10. The variable with the lowest beta (0.04) was “The quality of your interaction

with other students.”

Table 10

Regression Analysis of the Five Survey Items With the Strongest Effect

"I enjoy working on assignments by myself":
    B = .606, Std. Error = .000, Beta = .062, t = 1.611E8, p = .000
"Compared to face-to-face course setting, I am satisfied with this learning experience":
    B = .606, Std. Error = .000, Beta = .057, t = 1.185E8, p = .000
"Technical problems are not frequent and they do not adversely affect my understanding of the course":
    B = .606, Std. Error = .000, Beta = .057, t = 1.275E8, p = .000
"I am satisfied with the process of collaboration activities during the course":
    B = .606, Std. Error = .000, Beta = .054, t = 9.096E7, p = .000
"Generally, I am more engaged in my blended courses":
    B = .606, Std. Error = .000, Beta = .054, t = 9.195E7, p = .000

Note. B and Std. Error are unstandardized coefficients; Beta is the standardized coefficient. Created by the author of this thesis.

Figure 6 and Table 11 show the mean (±SD) satisfaction rate per respondent and the domain scores. The mean (±SD) satisfaction rate per respondent was 77.81% (±11.37). The highest mean (±SD) score was for the Course Management Domain (80% ±19.77), followed by the Instructor Domain (79.64% ±12.65), Interaction Domain (79.23% ±13.35), Technology Domain (77.50% ±14.02), and Instruction Domain (75.51% ±13.07). Overall, the findings in Table 11 show that students were relatively satisfied with the interaction, instruction, instructor, course management, and technology domains. These outcomes were consistent with the results of Abou-Naaj et al. (2012).
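The descriptive statistics in Table 11 can be obtained in a few lines once respondent-level scores are available. The sketch below is illustrative only; the file and column names are hypothetical and are not taken from the thesis data.

    import pandas as pd

    # Hypothetical layout: one row per respondent, one column per domain score
    # plus the overall satisfaction rate, all expressed as percentages.
    df = pd.read_csv("survey_scores.csv")     # assumed file name

    columns = ["satisfaction", "interaction", "instruction",
               "instructor", "course_management", "technology"]
    summary = df[columns].agg(["mean", "std", "min", "max"]).T
    print(summary.round(2))                   # mean, SD, minimum, and maximum per domain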

Figure 6. Percentages of Students’ Responses per Domain.


Table 11

Descriptive Statistics of the Satisfaction per Respondent and per Domain

                                   Mean    SD      Minimum   Maximum
Satisfaction rate per respondent   77.81   11.37   47.88     96.97
Interaction Domain                 79.23   13.35   42.22     100.00
Instruction Domain                 75.51   13.07   43.33     96.97
Instructor Domain                  79.64   12.65   37.14     100.00
Course Management Domain           80.00   19.77   20.00     100.00
Technology Domain                  77.50   14.02   40.00     100.00

Note. Values are percentages.

The author of this thesis collected data concerning student satisfaction with the course BICH 221 from 2008 to 2017 in order to compare the degree of student satisfaction before and after implementing the blended learning system. An improvement in the level of student satisfaction was detected after the start of the implementation period. This is consistent with the findings of Paturusi, Usagawa, and Lumenta (2016), whose findings indicated that the implementation of blended learning improves student achievement and performance. It is contrary to what has been reported by Larson and Sung (2009), who found no difference in student satisfaction between traditional and blended learning. Table 12 and Figure 7 show the relation of the satisfaction rate per respondent and the domain scores with gender. Although the mean satisfaction rate per respondent was marginally higher in females than in males, this difference was not statistically significant (p = 0.750); Hypothesis 4 was rejected. The mean satisfaction in the interaction and technology domains was higher in females than in males; however, a statistically significant difference was found only in the technology domain (p = 0.009). Males were more satisfied than females in the instruction and instructor domains, but this difference was not statistically significant.
Table 12

Descriptive Statistics of Students' Perception of Domains per Gender

                                           Mean    SD      Minimum   Maximum   p*
Satisfaction rate per respondent   Male    77.72   11.04   47.88     96.97     0.750
                                   Female  77.86   11.59   53.33     96.97
Interaction Domain                 Male    78.67   12.53   55.56     100.00    0.386
                                   Female  79.53   13.80   42.22     100.00
Instruction Domain                 Male    75.87   11.93   43.33     93.33     0.926
                                   Female  75.32   13.67   45.00     96.67
Instructor Domain                  Male    81.03   13.59   37.14     100.00    0.105
                                   Female  78.91   12.10   48.57     100.00
Course Management Domain           Male    80.00   20.53   20.00     100.00    0.887
                                   Female  80.00   19.42   20.00     100.00
Technology Domain                  Male    74.40   12.26   53.33     100.00    0.009
                                   Female  79.15   14.65   40.00     100.00

Note. * Based on Mann-Whitney test.

When students were asked to express their opinions regarding the tools used in

the blended course, there was a clear consensus between male and female students

regarding assessment of their experiences with such tools (Figure 7).

There is a slight variation in favor of female students in the technology domain. This challenges the results of Jones et al. (2009), who stated that female students spent less time online than their male counterparts. In a survey about learning technologies, Huang, Hood, and Yoo (2012) found that males showed more confidence working with computer applications and web portals than females, while female students expressed greater confidence in social media platforms and e-mail interaction (Jones et al., 2009; Huang, Hood, & Yoo, 2012).

On the other hand, Huang, Hood, and Yoo (2012) mentioned that the use of interpersonal, communication, and education platforms is more common among females, while males tend to use the Internet for entertainment activities. This may explain the slight variation that was detected during the analysis.

Figure 7. A comparison between male and female per domain.

Analysis of the Open-ended Questions

This subsection addresses the analysis of participants' responses to the three open-ended questions, which examined their opinions on the course site and its advantages and disadvantages. Unfortunately, the majority of respondents did not comment on these queries; a total of only five comments were provided. In the comments, students reported their need for training on how to deal with the blended learning environment.

Retrospective Analysis

The researcher used retrospective analysis to assess Hypothesis 1 and Hypothesis

2. Nine academic years in which the course BICH 221 was offered were included in the
study. The total number of students registered in the course in the nine academic years

was 3530 students. The average number of students registered in the course was

approximately 392 students, ranging from 191 to 585 students, per academic year.

Blended learning was implemented in the three most recent academic years. Details of

the pass/fail/drop rates, as well as course satisfaction rates, are shown in Table 13.

Table 13

Percentages of Students' Outcomes (2008-2017)

Academic Year   Blended Learning   Students Registered   Pass Rate (%)   Fail Rate (%)   Drop Rate (%)   Course Satisfaction Rate (%)
2008-2009       No                 191                   86.38           9.42            4.18            79.90
2009-2010       No                 287                   76.65           17.42           5.92            74.21
2010-2011       No                 330                   86.40           6.90            6.34            77.42
2011-2012       No                 380                   81.58           11.05           7.37            75.65
2012-2013       No                 585                   87.01           6.32            6.67            76.67
2013-2014       No                 445                   84.72           10.56           4.72            79.30
2014-2015       Yes                478                   80.33           12.13           7.53            82.18
2015-2016       Yes                556                   85.07           12.23           2.70            82.55
2016-2017       Yes                278                   86.33           8.99            4.68            87.62

Note. Blended learning was applied in the academic years 2014-2015 through 2016-2017.

In 2014, REU introduced blended learning, which represented a turning point in the university's instruction system. In the years 2014-2017, once the integration of e-learning was finalized, a favorable development in student outcomes in the blended course became apparent. The analysis shown in Table 13 revealed an improvement in the student satisfaction rate, rising to approximately 88% in 2017. Such results correspond to previous studies (Lopez-Perez et al., 2011; Sajid et al., 2016; Abou-Naaj et al., 2012; Wu et al., 2010), whose researchers concluded that the implementation of blended learning positively affected student satisfaction.

In Table 14, Spearman’s correlation shows a statistically significant positive

correlation between the academic year and the course satisfaction rate (p=0.036). The

correlation tests between academic years and pass rates, fail rates, drop rates, and all

grade rates showed no statistical significance.
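This correlation can be checked directly against the yearly satisfaction rates listed in Table 13. The short Python sketch below is illustrative only; the academic years are simply coded 1 through 9 in chronological order.

    from scipy.stats import spearmanr

    # Academic years 2008-2009 through 2016-2017 coded 1..9, and the course
    # satisfaction rates (%) reported in Table 13.
    year_rank = list(range(1, 10))
    satisfaction = [79.90, 74.21, 77.42, 75.65, 76.67, 79.30, 82.18, 82.55, 87.62]

    rho, p_value = spearmanr(year_rank, satisfaction)
    print(round(rho, 3), round(p_value, 3))   # approximately rho = 0.700, p = 0.036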

Table 14

Correlations of Academic Year with Students' Outcomes and General Satisfaction for the Course BICH 221 (2008-2017)

Pass Rate (%): rho = -0.050, p = 0.898
Percentage of Passed Students with Grade "A": rho = 0.367, p = 0.332
Percentage of Passed Students with Grade "B": rho = 0.333, p = 0.381
Percentage of Passed Students with Grade "C": rho = -0.400, p = 0.286
Percentage of Passed Students with Grade "D": rho = -0.167, p = 0.668
Fail Rate (%): rho = 0.033, p = 0.932
Drop Rate (%): rho = -0.100, p = 0.798
General Satisfaction Rate (%): rho = 0.700, p = 0.036*

Note. * Statistical significance based on Spearman's correlation.

The mean (±SD) pass rate, percentage of students with Grade A, percentage of

students with Grade B, percentage of students with Grade C, fail rate, and general

satisfaction rate were higher with blended learning implementation. However, statistical

significance was only found with the general satisfaction rate (p=0.020, Mann-Whitney U
test). The mean (±SD) percentage of passed students with Grade D and the drop rate were

lower with implementation of blended learning. This relation was not statistically

significant (Table 15). Based on these findings, Hypothesis 1 was accepted and

Hypothesis 2 was rejected.

Table 15

Descriptive and Analytical Statistics of Students' Outcomes and Satisfaction With and Without Blended Learning

                                      Blended   Mean    SD     Median   Mann-Whitney U   p value (exact p)
Pass Rate (%)                         Yes       83.91   3.16   85.07    7                0.606 (0.714)
                                      No        83.79   4.02   85.55
Passed Students with Grade "A" (%)    Yes       12.98   2.44   12.47    14               0.197 (0.269)
                                      No        10.12   1.89   9.98
Passed Students with Grade "B" (%)    Yes       25.16   4.28   26.30    10               0.796 (1.000)
                                      No        24.32   4.40   25.97
Passed Students with Grade "C" (%)    Yes       35.66   4.75   34.64    9                1.000 (1.000)
                                      No        34.61   4.22   34.58
Passed Students with Grade "D" (%)    Yes       26.21   2.42   27.27    4                0.197 (0.262)
                                      No        29.62   4.55   30.88
Fail Rate (%)                         Yes       11.12   1.84   12.13    12               0.439 (0.548)
                                      No        10.28   3.99   9.99
Drop Rate (%)                         Yes       4.97    2.43   4.68     7                0.606 (0.714)
                                      No        5.87    1.21   6.13
Course Satisfaction Rate (%)          Yes       84.12   3.04   82.55    18               0.020 (0.024)*
                                      No        77.19   2.16   77.05

Note. * Statistical significance. Numbers shown in the descriptive statistics are percentages.
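The comparison of course satisfaction with and without blended learning can likewise be reproduced from the yearly rates in Table 13. The sketch below is illustrative only and assumes a recent SciPy release in which the exact method for the Mann-Whitney test is available.

    from scipy.stats import mannwhitneyu

    # Course satisfaction rates (%) from Table 13, grouped by blended-learning status.
    with_blended = [82.18, 82.55, 87.62]                           # 2014-2015 to 2016-2017
    without_blended = [79.90, 74.21, 77.42, 75.65, 76.67, 79.30]   # 2008-2009 to 2013-2014

    u_stat, exact_p = mannwhitneyu(with_blended, without_blended,
                                   alternative="two-sided", method="exact")
    print(u_stat, round(exact_p, 3))   # U = 18.0, exact p = 0.024, matching Table 15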

The researcher attributed the significant correlation between blended learning and

the improvement in satisfaction rate to the fact that this system combined the advantages

of both electronic and conventional learning. Blended learning presented several

opportunities for the learners to train in a continuous manner with increased access to

information. In addition, it allowed them to interact directly with their colleagues and
their instructor to inquire about the course material. Through blended learning, the learner

can employ more than one means of acquiring knowledge and choose the appropriate

tools that suit their abilities and skills.

On the other hand, the results above failed to confirm the existence of a

correlation between the implementation of blended learning and achievement or

performance.

Previous results are contrary to the findings of Lin (2009), who asserted that the

implementation of blended learning improved students’ learning outcomes. Furthermore,

Dziuban, Hartman, and Moskal (2004) confirmed that the successful integration of face-

to-face and online instructional methods in blended learning courses may augment

students’ learning outcomes. A study carried out by Jee and O’Connor (2014) showed

that blended learning application has increased the proficiency scores of the participating

learners. Yuen et al. (2009) likewise stated that compared to the traditional education

system alone, a blended learning system can provide students with a unique learning

experience, thus improving student satisfaction and performance. Similarly, Garrison and

Kanuka (2004) showed that the application of an efficient blended system may certainly

increase students’ examination results. Mclaughlin et al. (2015), Kenney & Newcombe

(2011), Lopez-Perez et al. (2011), Donnelly (2010), Vaughan (2007), Taradi et al. (2005),

Woltering et al. (2009), and Reasons, Valadares, and Slavkin (2005) also agreed with the

previous opinions.

However, the failure of this research to show a significant impact of blended learning on student performance can be attributed partly to the first year in which the system was implemented (2014). Students were not intellectually or mentally prepared to adjust to the unfamiliar learning style. For instance, the drop rate was the highest in 2014, the first year of the implementation; similarly, the passing rate was the second lowest in that year. More recent years of implementation showed better performance than the first year. The reason might be unfamiliarity with the system for both students and instructor, or the instructor's lack of pedagogical and technical skills.

The result from this research is in alignment with prior studies, which concluded

that there is no significant variation in students' performance between traditional and

blended learning (Kwak, Menezes, & Sherwood, 2015; Li et al., 2014). With regard to

this outcome, the researcher believes that this result is because of the limited number of

years in which blended learning has been implemented at REU, which influences

students’ learning achievement and motivation.

Focus Group Data Analysis

The preceding sections dealt with quantitative data analysis through surveys and

retrospective data, while this section discusses the qualitative findings of face-to-face

interviews with the students. The interview with the students was organized to obtain

deeper understanding of blended learning at REU and to supplement the quantitative

results. The interview questions explore students’ opinions on the digital learning

platform as a supplementary learning resource for the primary learning model. Eight students signed up to participate. During the interviews, comments were recorded to identify important themes and to follow up on interesting answers. The following themes

form the framework of this section: characteristics, advantages, and disadvantages.


Course Site Characteristics

The first category in the interview pertained to the characteristics of the digital learning platform, its impact on students' experiences, and the students' perceptions regarding the usefulness of the course site. After the coding process was applied to the raw data, three categories emerged under the site characteristics theme. These categories covered the course site content, the students' interaction, and the course site management/administration.

The answers showed that there is consensus on the benefit gained from the

content of the course site, such as presentations, course syllabus, quizzes, and external

resources. Many interviewees believed that having a number of course-related resources

and materials on the course site helped them to be more organized and time-efficient. For

example, Participant S5 said, “The learning platform provides more space to effectively

operate the lecture time and support the course subjects via additional and external

references.” Participants S2, S6, and S8 reported that they felt overloaded by all these

extra activities. Participant S1 asserted that it was helpful to use videos to support lecture

materials.

With respect to the level of communication, interaction and collaboration in the

blended learning environment, participants confirmed that the digital e-learning platform

led to improved communication and interaction with their instructor and their colleagues.

During the interview, the majority of students asserted that the course site served

as a continuous channel of communication with their instructor through discussion forum,

comments box, and e-mail. For instance, S3 and S6 stated that the site was a way of presenting announcements, setting up exam schedules, and posting notes and alerts,

as when the course director changes the lecture time or location. In the same manner,

some students reported that the discussion forum was rarely used during this course,

although the participants believed that the discussion forums would provide an

opportunity to initiate dialogue, exchange views, and facilitate communication with the

instructor outside the university boundaries.

Concerning the course management/administration category, there was unanimous agreement that the platform was used mainly for resource delivery, course documents, the online gradebook, syllabus publication, and communication via e-mail. S1,

S5, S6, and S8 affirmed that they had performed their assignments and sent them back to

the instructor through Moodle. However, the entire group of participants confirmed that

most of the interactive features in Moodle were largely unused.

In summary, the interview data showed that the digital learning platform was a genuine source of support, carrying part of the course load that had previously been delivered during class hours. By accessing course-related material on the course site, students gained a series of benefits, such as improved time management, better communication, and greater productivity. Moreover, it became clear that the majority of participants were satisfied with, and receptive to, the opportunities presented by the digital platform for boosting communication.

Course Site Advantages

The second category examined the advantages of the learning platform by

investigating the benefits of the learning, teaching, and communication aspects. This
section investigates whether the students had observed any changes in their outcomes

(examination grades). This domain was also examined through retrospective and

quantitative data. There was a variety of responses concerning this question. Some felt

that the course site had an impact on improving their learning achievements, while others

asserted that they did not notice any difference in their performance after implementing

the blended learning system. For example, S3 and S8 were uncertain of the impact of the

course site on their academic performance. In addition, S7 asserted that he had difficulty

in dealing with the site, which had a negative impact on his performance.

In terms of the positive impact of the digital platform, most participants

commented that the educational material is easily accessible, which enabled students to

prepare in advance or follow up on what they missed. Participant S5 stated that they were

interested in performing quizzes online so as to save the lecture time for other purposes.

The collected data emphasized that the presence of additional activities was a favorable

aspect of content selection and curriculum delivery. Further, the data revealed that the site helped to relieve some of the load of classroom activities, so that the teacher was able to use the lecture time more effectively. The students noted that the course site improved

instructor interaction with them, which had a role in motivating them.

With regard to the advantages of the course site in communication, there was a

consensus among interviewees that the blended course improved instructor-student and

student-student interactions. Moreover, some participants acknowledged that the

availability of interactive features and online communication facilitated discussion and

collaboration with the instructor outside the lecture. For instance, Participant S8 stated
that the course instructor used the course site to publish announcements and assignments

that required students to interact and respond to them. Similarly, S1 asserted that the

course site enhanced the interaction, stating: “There are many features within Moodle

that allowed us to stay in touch with the course director, but of course, all these activities

required the instructor to devote part of his time to interacting with students.”

Course Site Disadvantages

The final category was concerned with the factors that negatively affected

students' experiences with blended learning. The researcher classified data within this category into three areas: course management/administration, technical issues, and issues related to students' experiences.

The interviewees reported the following issues:

 Computer proficiency and competency. All participants confirmed that a lack of technological proficiency had a major influence on their experiences with Moodle. Further, Participant S8 mentioned, "It is important to have basic knowledge of computer hardware to deal with e-learning systems."

 Technical issues. Interviewees mentioned that having a strong Internet


connection or high bandwidth is required for working with the course site.
Notably, some students pointed out that they found it difficult to keep pace
with the technical requirements of the blended course.

 Course management/administration. According to participants, there are many


factors related to this category, including the instructor’s lack of computer
competency, the instructor’s having insufficient time to communicate with
students or to develop online material, copyright restrictions and procedures, and
the lack of motivation. S3 stated that they were overloaded with the amount of
extra activities required by this course. The majority of participants agreed that
the instructor did not use and apply many of the features available via Moodle.

In summary, the course site was a helpful support to students' learning and course

delivery. Participants affirmed that the course site helped them to expand their learning
experiences, access course material at any time, and improve communication with the

course director. The students affirmed that the course site was regularly updated. The

findings reveal a common acceptance by the majority of students of the convenience of

the blended learning system; this was also confirmed in the quantitative data. The

students identified several benefits of the course site, such as enhanced delivery of the

course, increased flexibility, and reduced cost. The interviewees attributed the difficulty

of dealing with the digital platform (course site) to the fact that blended learning is still

something new for them. This corresponds to the reality of the current situation in which

the integration of e-learning is still in the preliminary stage of educational and

pedagogical change.
CHAPTER 5

SUMMARY, CONCLUSION, AND RECOMMENDATIONS

Summary

This study scrutinizes the quality of a blended learning environment at Riyadh Elm University (REU) in Riyadh, Saudi Arabia, by investigating students' experiences with the e-learning site. The learning site is a means of delivering course content that complements what is taught to students in conventional (face-to-face) education. Tracking the scientific and technological developments taking place in the world, Riyadh Elm University responded by integrating technology into the education process. In 2014, the transformation of curriculum delivery was achieved through the Moodle platform.

This chapter encapsulates the findings of this research. Further, it focuses

attention on the study limitations. In the end, a number of recommendations are

suggested for further research.

The first chapter covered several aspects for solving the research problem: the

introduction, the study problem, its purpose, and the theoretical framework. This research

is the first of its kind in the university, looking at the effectiveness of the blended

learning system and the factors that might improve the quality of learning, teaching, and

communication at REU. The primary question for this research was: How do students at

REU regard the application of blended learning?


Chapter 2 reviewed related literature and experimental studies in pedagogy and

the integration of ICTs in education. Nowadays, technology plays a critical role in the

education process, placing students at the core and leading to changes in many

educational policies and practices. The early 21st century featured the emergence of the

blended learning approach as a contemporary learning style (Vo et al., 2017; Masi &

Winer, 2005). Additionally, research results confirmed that students are generally pleased

with their involvement in blended learning environments, including communication with

peers and instructors (Knight & Allen, 2012; Yuen et al., 2009; Hatch et al., 2004;

Graham et al. 2005; Garrison & Vaughan, 2008; Iqbal et al., 2011; Holley & Oliver,

2010; Almalki, 2011).

In order to thoroughly explore the research question, Chapter 3 discussed a

blended quantitative and qualitative research method. It also established a blueprint for

collecting, measuring, and analyzing data. The methodology applied to perform this study

was the PDSA cycle. Optimum blended courses were determined based on a quality

assessment checklist. Eventually, one course was selected as a sample for this study.

Within this course, 308 participants received the survey form; only 216 valid forms were

filled, returned, and submitted for analysis. To delve deeply into the research problem,

two types of data were collected from students, quantitative and qualitative. All the

research data have been tested for validity and reliability.

Chapter 4 presented the analysis and discussion of Chapter 3 findings. The

participants perceived the usefulness of the digital platform, especially in improving their

communication with the teacher and facilitating their access to the course materials.
During the study, a number of obstacles were identified that negatively affected students'

experience with the course site as a learning medium, such as the inadequate ICT

infrastructure and lack of technical support and computer competency. Students reported

high satisfaction with the five main domains that were examined to evaluate the

effectiveness of the course site, which are interaction, instruction, instructor, technology

(ICTs), and course management/administration. The retrospective results were obtained by analyzing data from historical records for the period 2008 to 2017. A

significant relationship was found between the students’ course satisfaction and the

implementation of blended learning. The qualitative findings were obtained from students

interviewed regarding the research problem. The results showed that the course site was

useful for course delivery and intercommunication.

The students’ responses are indicators of their approval of this novel learning

modality, and their expressions concerning the online part of the blended course are a

helpful indicator with respect to continuous improvement. The findings of this study

demonstrate that students are in agreement that blended learning at REU is beneficial.

This corresponds with the body of literature in this context. Overall, the results were

positive, and the students reported highly favorable opinions regarding their experiences

with blended learning.


Conclusion

The blended learning method is able to provide high-quality learning in higher

education. Riyadh Elm University (REU) has integrated e-learning as a complementary

learning method to traditional learning, as have many higher education institutions in

Saudi Arabia. Despite the integration of ICT in the education system at REU, few staff

complied with the change and developed their blended learning environments. The integration raised many concerns about the quality of learning; consequently, several guidelines on quality assurance and on developing recognition for this form of learning were enforced. Therefore, it was necessary to assess and monitor the quality of the

blended learning system at REU. The Deming Cycle (PDSA) can be a helpful tool for use

by an independent quality assurance centre at an academic institution to assess and

improve the quality of learning. Assessment is an essential criterion for sustainable

improvement of the blended learning system. This study identified the negative outputs to

be dealt with in the second improvement cycle. The implementation of blended learning

greatly improves students’ satisfaction and their learning experiences. Particularly,

blended learning can expand communication and interaction between instructor and

students, which ultimately would support the learning/teaching process. Moreover, the

instructor’s experience and support are particularly important to enhance student

satisfaction. Students will probably enjoy participation, the level of effort, working on

assignments, and communication in the blended learning environment. However, blended

learning may exert no impact on students’ academic accomplishments.


The satisfaction survey is a useful tool to weigh the various elements of

satisfaction with blended learning, especially if all domains are well represented in the

survey questions. Although most students may feel satisfied about blended learning,

some students prefer the face-to-face instruction modality over blended learning. A

comparable number of students still believe in the use of ICTs in learning, with varying degrees of utilization. In particular, Moodle tools are useful and easy to use, but they should not be deployed at random; tools should be used to meet the objectives of the course and to achieve students' learning outcomes.

Satisfaction in blended learning is the same across both genders; however,

technology in blended learning is more likely to be appreciated by female students.

Several aspects may contribute to the successful implementation of blended learning:

 Adequate computer and technological skills for both instructor and students.

 Administrative support.

 Professional development of staff.

 A well-designed structure for technical support; it is observed that technical


issues are best addressed properly before implementing blended learning.

 A mechanism that assesses and monitors the e-learning system.

 A policy that encourages the integration of e-learning within learning and


teaching.

This research shows that the blended learning system is a promising approach with

positive potential in the field of higher education, if implemented correctly and monitored

periodically.
Recommendations

Stage 4: Act

This section addresses the “Act” phase of the PDSA cycle. Based on the identified

findings (root causes) in the prior chapter (Study phase), the researcher recommends the

following action plans be implemented in the subsequent planning phase:

 REU must focus on the quality of e-learning by formulating a manual that


manages the education, monitors the program, and periodically assesses it to
ensure linking of the usage of the e-learning environment with learning
objectives and outcomes.

 A course management/administration strategy or a plan must be designed to


improve the structure of the online learning platform, such as providing
students with timely feedback, clear instruction, and a clear syllabus.
 REU must apply a quality model such as PDSA to continuously monitor and
assess the learning/teaching process.

 In designing a blended learning environment, the university must offer


training programs to enhance teachers' skills in developing and displaying
course materials.

 The University must raise awareness of the importance of e-learning and its
role in stimulating the education process and improving the outcomes of
students.

 For future research, a thorough investigation and evaluation should be


undertaken at the university level.
REFERENCES

Abou-Naaj, M., Nachouki, M., & Ankit, A. (2012). Evaluating Student Satisfaction with

Blended Learning in a Gender-Segregated Environment. Journal of Information

Technology Education: Research, 11, 185-200.

Aggarwal, A., & Lynn, S. (2012). Using Continuous Improvement to Enhance an Online

Course. Decision Sciences Journal of Innovative Education, 10(1), 25-48.

Akyol, Z., Vaughan, N., & Garrison, D. R. (2011). The impact of course duration on the

development of a community of inquiry. Interactive Learning Environments, 19(3),

231–246.

Albalawi, M. S. (2007). Critical factors related to the implementation of web-based

instruction by higher-education faculty at three universities in the Kingdom of Saudi

Arabia (Doctoral dissertation, University of West Florida).

Alebaikan, R. (2010). Perceptions of Blended Learning in Saudi Universities, PQDT -

UK & Ireland.

Alexander, S. (1999). An evaluation of innovative projects involving communication and

information technology in higher education. Higher Education Research &

Development, 18(2), pp. 173–182.

Alfahad, F. N. (2012). Effectiveness of using information technology in higher education

in Saudi Arabia. Procedia-Social and Behavioral Sciences, 46, 1268-1278.

Almalki, A. M. (2011). Blended learning in higher education in Saudi Arabia: A study of

Umm Al-Qura University. Retrieved from

https://researchbank.rmit.edu.au/eserv/rmit:14613/Almalki.pdf
Alruwaih, M. E. (2015). Effect of blended learning on student's satisfaction for students

of the public authority for applied education and training in Kuwait. International

Journal Science, Movement and Health, 15(2).

American Society for Quality. (n.d.). Continuous improvement. Retrieved from

http://asq.org/learn-about-quality/continuous-improvement/overview/overview.html

Arbaugh, J. B. (2014). What might online delivery teach us about blended management

education? Prior perspectives and future directions. Journal of Management

Education, 38(6), 784-817.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, R., Ice, P., Richardson, J., &

Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a

measure of the community of inquiry framework using a multi-institutional sample.

Internet and Higher Education, 133-136. doi: 10.1016/j.iheduc.2008.06.003

Armstrong, D. (2011, October). Students’ perceptions of online learning and instructional

tools: A qualitative study of undergraduate students’ use of online tools. In E-Learn:

World Conference on E-Learning in Corporate, Government, Healthcare, and

Higher Education (pp. 1034-1039). Association for the Advancement of Computing

in Education (AACE).

Azizan, F. Z. (2010). Blended learning in higher education institution in Malaysia.

In Proceedings of Regional Conference on Knowledge Integration in ICT 2010, pp.

454–466.

Baba, N., Sa’ari, H., Daud, S. C., Adenan, H., & Kamarulzaman, S. H. (2014, October).

Students' Perceptions and Readiness in Practicing Blended Learning in an Institution


of Higher Education in Malaysia. In ICICKM2014-Proceedings of the 11th

International Conference on Intellectual Capital, Knowledge Management and

Organizational Learning: ICICKM2014 (p. 39). Academic Conferences Limited.

Babb, S., Stewart, C., & Johnson, R. (2010). Constructing communication in blended

learning environments: students’ perceptions of good practice in hybrid courses.

MERLOT Journal of Online Learning and Teaching, 6(4), 735-753.

Bailey, K. D. (2002). The effects of learning strategies on student interaction and student

satisfaction. Dissertation Abstract international, 63(7).

Begičević, N., Divjak, B., & Hunjak, T. (2007). Prioritization of e-learning form: a multi

criteria methodology. Central European Journal of Operations Research. 15(4),

405.

Belanger, F. (Ed.). (1999). Evaluation and Implementation of Distance Learning:

Technologies, Tools and Techniques: Technologies, Tools and Techniques. IGI

Global.

Bhuiyan, N., & Baghel, A. (2005). An overview of continuous improvement: from the

past to the present. Management decision, 43(5), 761-771.

Bichelmeyer, B. (2003). Checklist for formatting checklists. Kalamazoo, MI: The

Evaluation Center, Western Michigan University.

Birmingham, P., & Wilkinson, D. (2003). Using research instruments: A guide for

researchers. Routledge.

Boklaschuk, K., & Caisse, K. (2001). Evaluation of educational web sites. Retrieved

from http://members.fortunecity.com/vqf99
Bollinger, D. U., & Martindale, T. (2004). Key factors for determining student

satisfaction in online courses. International Journal of E-Learning, 3(1), 61-67.

Bonk, C. J., & Graham, C. R. (2012). The handbook of blended learning: Global

perspectives, local designs. John Wiley & Sons.

Bower, B. L., & Kamata, A. (2000). Factors influencing student satisfaction with online

courses. Academic Exchange Quarterly, 4(3), 52-56.

Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. Sage.

Boyle, T., Bradley, C., Chalk, P., Jones, R., & Pickard, P. (2003). Using blended learning

to improve student success rates in learning to program. Journal of Educational

Media, 28(2–3), pp. 165–178.

Braun, V., & Clarke, V. (2013). Successful qualitative research: A practical guide for

beginners. Sage. pp. 206.

Brega, A. G., Barnard, J., Mabachi, N. M., Weiss, B. D., DeWalt, D. A., Brach, C., &

West, D. R. (2015). AHRQ health literacy universal precautions toolkit. Rockville,

MD: Agency for Healthcare Research and Quality.

Brill, J., & Galloway, C. (2007). Perils and promises: University instructors’ integration

of technology in classroom-based practices. British Journal of Education

Technology, 38(1), 95.

Brophy, J. (1999). Teaching. Educational practices series--1. International Academy of

Education & International Bureau of Education, Vol 1.


Brown, J. F., & Marshall, E. L. (2008). Continuous quality improvement: An effective

strategy for improvement of program outcomes in a higher education setting.

Nursing Education Perspectives, 29(4), 205-211.

Bryman, A. (2016). Social research methods. Oxford University Press.

Buć, S., & Divjak, B. (2015, January). Innovation Diffusion Model In Higher Education:

Case Study Of E-Learning Diffusion. Proceedings of the IADIS International

Conference e-learning. In International Conference on e-Learning.

Burns, M. (2016). Ensuring quality online learning for teachers. Retrieved September 15,

2018, from https://www.globalpartnership.org/blog/ensuring-quality-online-

learning-teachers

Calderon, O., Ginsberg, A., & Ciabocchi, L. (2012). Multidimensional assessment of

pilot blended learning programs: maximizing program effectiveness based on

student and faculty feedback. Journal of Asynchronous Learning Networks, 16(3),

pp. 23-37.

Carmel, A., & Gold, S. (2007). The effects of course delivery modality on student

satisfaction and retention and GPA in on-site vs. hybrid courses. Turkish Online

Journal of Distance Education-TOJDE, 8(2: Article 11), 127-135.

Caruso, J. B., & Salaway, G. (2007). The ECAR study of undergraduate students and information technology, 2007. Retrieved December 9, 2017.

Cavanagh, T. B. (2012). The postmodality era: How ‘online learning’ is becoming

‘learning.’ In Oblinger, D. (Ed.), Game Changers: Education and Information

Technology (215-228). Boulder, CO: EDUCAUSE


Chambers, M. (1999). The efficacy and ethics of the use of digital multimedia for educational purposes. The convergence of distance and traditional modes of higher education. London and New York: Routledge, pp. 5–17.

Chao, T., Saj, T., & Tessier, F. (2006). Establishing a quality review for online courses.

Educause Quarterly, 29(3), 32.

Chen, C. C., & Jones, K. T. (2007). Blended learning vs. traditional classroom settings: Assessing effectiveness and student perceptions in an MBA accounting course. Journal of Educators Online, 4(1), n1.

Cho, M., & Kim, B. J. (2013). Students’ self-regulation for interaction with others in

online learning environments. Internet and Higher Education, 17, 69‒75.

Cho, M. H., & Tobias, S. (2016). Should instructors require discussion in online courses?

Effects of online discussion on community of inquiry, learner time, satisfaction, and

achievement. The international review of research in open and distributed

learning, 17(2).

Coates, H. (2010). Development of the Australasian survey of student engagement

(AUSSE). Higher Education, 60(1), 1-17.

Colton, D., & Covert, R. (2013). Designing and constructing instruments for social

research and evaluation. John Wiley and Sons, 1, 64-86.

Cox, S., Wilcock, P., & Young, J. (1999). Improving the repeat prescribing process in a

busy general practice: A study using continuous quality improvement methodology.

BMJ Quality & Safety, 8(2), 119-125.


Creswell, J. W., & Creswell, J. D. (2017). Research design: Qualitative, quantitative, and

mixed methods approaches. Sage publications.

Creswell, J.W., & Miller, D.L. (2000). Determining Validity in Qualitative Inquiry.

Theory into Practice, 39, 124-130.

Creswell, J. W., & Clark, V. L. P. (2017). Designing and conducting mixed methods

research. Sage publications.

Creswell, J. W., & Poth, C. (2016). Qualitative inquiry and research design: choosing

among five approaches. SAGE Publications Asia-Pacific Pte. Ltd.

Creswell, J. W., & Tashakkori, A. (2007). Editorial: differing perspectives on mixed

methods research. Journal of Mixed Methods Research, 1(4), 303-308.

Cuban, L. (1999). High-tech schools, low-tech teaching. The Education Digest, 64(5), 35.

Davidson-Shivers, G. V. (2009). Frequency and types of instructor-interactions in online

instruction. Journal of Interactive Online Learning, 8(1), 23–40.

Davis, R., & Wong, D. (2007). Conceptualizing and measuring the optimal experience of

the eLearning environment. Decision Sciences Journal of Innovative Education,

5(1), 97-126.

De-Bourgh, G. (2003). Predictors of student satisfaction in distance-delivered graduate

nursing courses: What matters most? Journal of Professional Nursing, 19(3), 149-

163.

Deming, W.E. (1986). Out of the Crisis. Cambridge, MA, Massachusetts Institute of

Technology. Center for Advanced Engineering Study.


Deperlioglu, O., & Kose, U. (2013). The effectiveness and experiences of blended

learning approaches to computer programming education. Computer Applications in

Engineering Education, 21(2), 328-342.

Divjak, B., & Redep, N. B. (2015). Strategic Decision Making Cycle in Higher

Education: Case Study of E-Learning. International Association for Development of

the Information Society.

Donnelly, R. (2010). Harmonizing technology with interaction in blended problem-based

learning. Computers & Education, 54(2), 350–359.

Doyle, L., Brady, A. M., & Byrne, G. (2009). An overview of mixed methods research.

Journal of Research in Nursing, 14(2), 175-185.

Dziuban, C., Hartman, J., Cavanagh, T. B., & Moskal, P. D. (2011). Blended courses as

drivers of institutional transformation. In Blended learning across disciplines:

Models for implementation (pp. 17-37). IGI Global.

Dziuban, C. D., Hartman, J. L., & Moskal, P. D. (2004). Blended learning. EDUCAUSE

Center for Applied Research, Research Bulletin, (7).

Dziuban, C., & Moskal, P. (2011). A course is a course is a course: Factor invariance in

student evaluation of online, blended, and face-to-face learning environments.

Internet and Higher Education, 14, 236-241.

Dziuban, C., Moskal, P., Kramer, L., & Thompson, J. (2013). Student Satisfaction with

Online Learning in the Presence of Ambivalence: Looking for the Will-o'-the-Wisp.

Internet and Higher Education, 17, 1-8.


Education Evaluation Commission. (2016). About EEC. Ministry of Higher Education,

Saudi Arabia. Retrieved from http://www.eec.gov.sa/

Ehlers, U. (2004). Quality in e-learning. The learner as a key quality assurance category.

European Journal of Vocational Training, 29, 3-15.

El-Mowafy, A., Kuhn, M. & Snow, T. (2013). Blended learning in higher education:

Current and future challenges in surveying education. In Special issue: Teaching and

learning in higher education: Western Australia's TL Forum. Issues in Educational

Research, 23(2), 132-150. http://www.iier.org.au/iier23/el-mowafy.html

El-Tigi, M. (2001). Integrating WWW technology into classroom teaching: college

students' perceptions of course web sites as an instructional resource. Doctoral

dissertation, Syracuse University, Syracuse, NY.

Ercan, I., & Kan, I. (2004). Reliability and validity in the scales. Uludağ Üniversitesi Tıp

Fakültesi Dergisi, 30(3), 211-6.

Everitt, B. S., & Palmer, C. R. (2005). Encyclopaedic companion to medical statistics.

Hodder Arnold; London.

Flumerfelt, S., & Green, G. (2013). Using Lean in the Flipped Classroom for At Risk

Students. Educational Technology & Society, 16(1), 356–366.

Futch, L., & Dziuban, Charles. (2005). A Study of Blended Learning at a Metropolitan

Research University, ProQuest Dissertations and Theses.

Garrison, D. R. (2011). E-learning in the 21st century: A framework for research and

practice. Routledge.
Garrison, R. (2000). Theoretical challenges for distance education in the 21st century: A

shift from structural to transactional issues. The International Review of Research in

Open and Distributed Learning, 1(1).

Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry

framework: Review, issues, and future directions. Internet and Higher Education,

10, 157–172. doi: 10.1016/j.iheduc.2007.04.001.

Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative

potential in higher education. The Internet and Higher Education, 7, 94-106.

doi:10.1016/j.iheduc.2004.02.001

Garrison, D. R., & Vaughan, N. D. (2008). Blended learning in higher education:

Framework, principles, and guidelines. John Wiley & Sons.

Gazza, E. A. (2015). Continuously Improving Online Course Design using the Plan-Do-

Study-Act Cycle. MERLOT Journal of Online Learning and Teaching, 11(2).

Gedik, N., Kiraz, E., & Ozden, M. (2013). Design of a Blended Learning Environment:

Considerations and Implementation Issues. Australasian Journal of Educational

Technology, 29(1), 1-19.

Ghadiri, K., Qayoumi, M. H., Junn, E., Hsu, P., & Sujitparapitaya, S. (2013). The

transformative potential of blended learning using MIT edX’s 6.002 x online MOOC

content combined with student team-based learning in class. EdX Environment, 8,

14.

Gillham, B. (2008). Developing a questionnaire. A&C Black.


Glesne, C., & Peshkin, A. (1992). Being there: Developing understanding through

participant observation. Becoming Qualitative Researchers: An Introduction. White

Plains, NY: Longman, 39-61.

Gorenflo, G., & Moran, J. W. (2010). The ABCs of PDCA. Public Health Foundation.

Retrieved from http://www.phf.org/resourcestools/Pages/The_ABCs_of_PDCA.aspx

Graham, C. (2006). Blended learning systems. Definition, current trends, and future

directions. The Handbook of blended learning: Global perspectives, local designs.

3–21.

Graham, C. R. (2009). Blended learning models. In Encyclopedia of Information Science

and Technology, Second Edition (pp. 375-382). IGI Global.

Graham, C. (2012a). Blended learning systems: Definition, current trends, and future

directions. In C. J. Bonk & C. R. Graham (Eds.), The Handbook of blended learning:

Global perspectives, local designs (pp. 3-21). San Francisco, CA: John Wiley &

Sons.

Graham, C. (2012b). Emerging practice and research in blended learning. M.J. Moore

(Ed.), Handbook of distance education (pp. 333-350). New York, NY: Routledge.

Graham, C., Allen, S., & Ure, D. (2005). Benefits and challenges of blended learning

environments. In M. Khosrow-Pour (Ed.), Encyclopedia of information science and

technology. Hershey, PA: Idea Group.

Graham, C. R., & Robison, R. (2007). Realizing the transformational potential of blended

learning: Comparing cases of transforming blends and enhancing blends in higher

education. Blended learning: Research perspectives, 83-110.


Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework

for mixed-method evaluation designs. Educational Evaluation and Policy Analysis,

11(3), 255-274.

Gueorguiev, T. (2006). Quality management in higher education. Quality of higher

education.

Güzer, B., & Caner, H. (2014). The past, present and future of blended learning: an in

depth analysis of literature. Procedia-social and behavioral sciences, 116, 4596-

4603.

Haddad, W. D., & Jurich, S. (2002). ICT for education: prerequisites and

constraints. Technologies for education: Potentials, parameters, and prospects.

Washington, DC: UNESCO / Academy for Educational Development, 42-57.

Harding, A., Kaczynski, D., & Wood, L. (2012, October). Evaluation of blended

learning: analysis of qualitative data. In Proceedings of The Australian Conference

on Science and Mathematics Education (formerly UniServe Science Conference).

Hara, N., & Kling, R. (2000). Students’ distress with a web-based distance education

course: An ethnographic study of participants’ experiences. Information,

Communication and Society, 3(4), 557–579.

Hatch, T., Bass, R., Iiyoshi, T., & Pointer-Mace, D. (2004). Building knowledge for

teaching and learning: the promise of scholarship of teaching in a networked

environment. Change, 36(5), 42-49.

Haughey, M., & Anderson, T. (1998). Networked learning: The pedagogy of the Internet.

Montreal: Chenelière/McGraw-Hill.
Herrington, A. J., Schrape, J., & Singh, K. (Eds.). (2012). Engaging students with

learning technologies. Curtin University.

Hew, K. F., Cheung, W. S. & Ng, C. S. L. (2010). Student contribution in asynchronous

online discussion: A review of the research and empirical exploration. Instructional

Science, 38(6), 571-606.

Vo, H. M., Zhu, C., & Diep, N. A. (2017). The effect of blended learning on student

performance at course-level in higher education: A meta-analysis. Studies in

Educational Evaluation, 53, 17-28.

Hilal, M., & Qamar, E. (2001). A future vision for the use of technological innovations in

the field of active sports. The world of education, 4(2), pp. 172.

Hinssen, P., & Chellam, M. (2010). The New Normal: Explore the limits of the digital

world. Mach media.

Holden, J. T., & Westfall, P. J. (2006). Instructional media selection for distance

learning: A learning environment approach. Distance Learning, 3(2), 1.

Holley, D., & Dobson, C. (2008). Encouraging student engagement in a blended learning

environment: The use of contemporary learning spaces. Learning, Media, &

Technology, 33(2), 138-151.

Holley, D., & Oliver, M. (2010). Student engagement and blended learning: portraits of

risk. Computers & Education, 54(3), 693-700.

Howell, J., Miller, P., Park, H. H., Sattler, D., Schack, T., Spery, E., Widhalm, S., &

Palmquist, M. (2013). Reliability and validity. Writing@CSU. Colorado State


University. Retrieved from

https://writing.colostate.edu/guides/guide.cfm?guideid=66

Hoy, W. K., & Adams, C. M. (2015). Quantitative research in education: A primer. Sage

Publications.

Huang, W. D., Hood, D. W., & Yoo, S. J. (2012). Gender divide and acceptance of

collaborative Web 2.0 applications for learning in higher education. Internet and

Higher Education, 16, 57-65.

Huang, R., & Zhou, Y. (2006). Designing blended learning focused on knowledge

category and learning activities. The handbook of blended learning: global

perspectives, local designs, 296-310.

Hughes, G. (2007). Using blended learning to increase learner support and improve

retention. Teaching in Higher Education, 12(3), 349-363.

Hughey, A. W. (2000). Application of the Deming philosophy to higher

education. Industry and Higher Education, 14(1), 40-44.

Hwang, G., Huang, T., & Tseng, J. (2004). A group-decision approach for evaluating

educational web sites. Computers & Education, 42(1), 65-86.

Iowa State University. (2010). Course evaluation survey in blended learning. ANR

Program Evaluation.

Iqbal, A., Kokash, H., & Al-Oun, S. (2011). The impact assessment of demographic

factors on faculty commitment in the Kingdom of Saudi Arabian universities.

Journal of College Teaching & Learning, 8(2), 1-7.


Jee, R., & O’Connor, G. (2014). Evaluating the impact of blended learning on

performance and engagement of second language learners. International Journal of

Advanced Corporate learning, 7(3), 12-16.


Johnson, B., & Christensen, L. (2008). Educational research: Quantitative, qualitative,

and mixed approaches. Thousand Oaks, CA: Sage.

Johnson, L., Adams, S., Cummins, M., Estrada, V., Freeman, A., & Hall, C. (2012). The

NMC Horizon Report 2012 Higher Education Edition. The New Media Consortium.

Austin, Texas.

Jones, S., Johnson-Yale, C., Millermaier, S., & Pérez, F. S. (2009). U.S. college students’

Internet use: Race, gender and digital divides. Journal of Computer-Mediated

Communication, 14, 244-264.

Cabero Almenara, J., Llorente Cejudo, M. C., & Puentes Puente, A. (2010). Online

students' satisfaction with blended learning [La satisfacción de los estudiantes en red

en la formación semipresencial]. Comunicar, 35, 149-157.

Kaur, M. (2013). Blended learning-its challenges and future. Procedia-Social and

Behavioral Sciences, 93, 612-617.

Kenney, J., & Newcombe, E. (2011). Adopting a blended learning approach: Challenges

encountered and lessons learned in an action research study. Journal of

Asynchronous Learning Networks, 15(1), 45-57.


Kerres, M., & Witt, C. (2003). A didactical framework for the design of blended learning

arrangements. Journal of Educational Media, 28(2-3), 101-113.

Khan, B. H. (2005). Managing e-learning: Design, delivery, implementation, and

evaluation. IGI Global.

Kim, W. (2007, August). Towards a definition and methodology for blended learning.

In The Proceedings of Workshop on Blended Learning (pp. 1-8).

King, G., Keohane, R. O., & Verba, S. (1994). Designing social inquiry: Scientific

inference in qualitative research. Princeton university press.

Kinshuk, D., & Yang, A. (2003). Web-based asynchronous synchronous environment for

online learning. United States Distance Education Association Journal, 17(2),

5–17.

Kintu, M., Zhu, J., & Kagambe, C. (2017). Blended learning effectiveness: The

relationship between student characteristics, design features and outcomes.

International Journal of Educational Technology in Higher Education, 14(1), 1-20

Knight, J. E., & Allen, S. (2012). Applying the PDCA Cycle to the complex task of

teaching and assessing public relations writing. International Journal of Higher

Education, 1(2), 66-83.

Köse, U. (2010). A blended learning model supported with Web 2.0

technologies. Procedia-Social and Behavioral Sciences, 2(2), 2794-2802.

http://dx.doi.org/10.1016/j.sbspro.2010.03.417

Krasnova, T., & Demeshko, M. (2015). Tutor-mediated support in blended

learning. Procedia - Social and Behavioral Sciences, 166, 404-408.


Krueger, R. A., & Casey, M. A. (2014). Focus groups: A practical guide for applied

research. Sage publications.

Kudrik, Y., Lahn, L. C., & Mørch, A. I. (2009). Technology-enhanced workplace

learning: Blended learning in insurance company. In 17th International Conference

on Computers in Education. Hong Kong: Asia-Pacific Society for Computers in

Education.

Kvavik, R. B. (2005). Convenience, communications, and control: How students use

technology. Educating the net generation, 1(2005), 7-1.

Kwak, D., Menezes, F., & Sherwood, C. (2015). Assessing the Impact of Blended

Learning on Student Performance. Economic Record, 91(292), 91-106.

Larson, D. K., & Sung, C. H. (2009). Comparing student performance: Online versus

blended versus face-to-face. Journal of Asynchronous Learning Networks, 13(1), 31-

42.

Laurillard, D. (2002). Rethinking university teaching: A conversational framework for the

effective use of learning technologies. Routledge.

Lebow, D. (1993). Constructivist values for instructional systems design: five principles

toward a new mindset. Educational Technology Research and Development, 41(3),

4–16.

Lee, G., Fong, W. W., & Gordon, J. (2013, August). Blended learning: The view is

different from student, teacher, or institution perspective. In International

Conference on Hybrid Learning and Continuing Education (pp. 356-363). Springer,

Berlin, Heidelberg.
Levin, D., & Arafeh, S. (2002). The digital disconnect: The widening gap between

Internet-savvy students and their schools. Washington, DC: Pew Internet & American Life Project.

Li, Z. Tsai, M-H., Tao, J., & Lorentz, C. (2014). Switching to blended learning: The

impact on students’ academic performance. Journal of Nursing Education and

Practice, 4(3), 245-251.

Lim, D. H., & Kim, H. J. (2003). Motivation and learner characteristics affecting online

learning and learning application. Journal of Educational Technology Systems,

31(4), 423–439.

Lim, D. H, & Morris, M. L. (2009). Learner and instructional factors influencing learning

outcomes within a blended learning environment. Educational Technology &

Society, 12(4), 282–293.

Lin, Q. (2009). Student views of hybrid learning: a one-year exploratory study. Journal

of Computing in Teacher Education, 25(2), 57-66.

Littlejohn, A., & Pegler, C. (2007). Preparing for blended e-learning. Abingdon,

England: Routledge.

Lopez-Perez, M. V., Perez-Lopez, M. C., & Rodriguez-Ariza, L. (2011).

Blended Learning in Higher Education: Students' Perceptions and Their Relation to

Outcomes. Computers & Education, 56(3), 818-826.

Luke, C. (2003). Pedagogy, connectivity, multimodality, and interdisciplinary. Reading

Research Quarterly, 38(3), 397-403.

Lundvall, B. Å. (2010). National systems of innovation: Toward a theory of innovation

and interactive learning (Vol 2). Anthem press.


Maki, R. H., Maki, W. S., Patterson, M., & Whittaker, P. D. (2000). Evaluation of a web-

based introductory psychology course: Learning and satisfaction in on-line versus

lecture courses. Behavior Research Methods, Instruments, & Computers, 32(2),

230–239.

Martinez-Torres, M. R., Toral Marin, S. L., Garcia, F. B., Vazquez, S. G., Oliva, M. A.,

& Torres, T. (2008). A technological acceptance of e-learning tools used in practical

and laboratory teaching, according to the European higher education area. Behaviour

and Information Technology, 27(6), 495-505.

Masi, A., & Winer, L. (2005). A university-wide vision of teaching and learning with

information technologies. Innovations in Education and Teaching International,

42(2), 147–155.

Masie, E. (2006). The blended learning imperative. The handbook of blended learning:

Global perspectives, local designs, 22-26.

Matulich, E., Papp, R., & Haytko, D. L. (2008). Continuous improvement through

teaching innovations: A requirement for today's learners. Marketing Education

Review, 18(1), 1-7.

McClue, B., Esmail, A., & Eargle, L. (2006). Assessing effective online instruction sites.

Academic Exchange Quarterly, 10(4), 95-101.

McLaughlin, J. E., Gharkholonarehe, N., Khanova, J., Deyo, Z. M., & Rodgers, J. E.

(2015). The impact of blended learning on student performance in a cardiovascular

pharmacotherapy course. American Journal of Pharmaceutical Education, 79(2), 24.


Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of

evidence-based practices in online learning: A meta-analysis and review of online

learning studies. Washington, DC: U.S. Department of Education.

Mebane, M., Porcelli, R., Iannone, A., Attanasio, C., & Francescato, D. (2008).

Evaluation of the efficacy of affective education online training in promoting

academic and professional learning and social capital. International Journal of

Human-Computer Interaction, 24(1), 68–86.

Melton, B. F., Bland, H. W., & Chopak-Foss, J. (2009). Achievement and satisfaction in

blended learning versus traditional general health course designs. International

Journal for the Scholarship of Teaching and Learning, 3(1), 26.

Mertens, D. M. (2014). Research and evaluation in education and psychology:

Integrating diversity with quantitative, qualitative, and mixed methods. Sage

publications.

Miclat, E. F. (2005). Strategic Planning in Education: Making Change Happen. Rex

Bookstore, Inc.

Milgram, L., Spector, A., & Treger, M. (1999). Plan, Do, Check, Act: The Deming or

Shewhart Cycle. Managing Smart, 25.

Milligan, W. W. (2010). Information Technology at Michigan Tech: 2010 Survey Results

and Discussion.

Minichiello, V., Aroni, R., & Hays, T. (2008). In-depth interviewing: Principles,

techniques, analysis. Pearson Education Australia.

Moen, R. D., & Norman, C. L. (2010). Circling back. Quality Progress, 43(11), 22.
Moore, K. M. (2011). Handbook of distance education. New York, NY: Routledge.

National Commission for Academic Accreditation and Assessment (NCAAA). (2009).

National qualifications framework for higher education in the Kingdom of Saudi

Arabia. NCAAA, Riyadh, Saudi Arabia.

National Commission for Academic Accreditation and Assessment (NCAAA). (2010).

Quality Assurance and Accreditation in Saudi Arabia. NCAAA, Riyadh, Saudi

Arabia.

National Learning Consortium. (2013). Continuous quality improvement (CQI) strategies

to optimize your practice. Health Information Technology Research Center

(HITRC).

Norberg, A., Dziuban, C., & Moskal, P. (2011). A time based blended learning model.

On the Horizon, 19(3), 207-216. http://dx.doi.org/10.1108/10748121111163913

Nunan, T., George, R., & McCausland, H. (2000). Rethinking the ways in which teaching

and learning are supported: the flexible center at the University of South Australia.

Journal of Higher Education Policy and Management, 22(1), pp. 85–98.

Onwuegbuzie, A. J., Jiao, Q. G., & Bostick, S. L. (2004). Library Anxiety: Theory,

Research, and Applications. Scarecrow Press.

Ornstein, M. (2013). A companion to survey research. Sage Publications.

Osguthorpe, R. T., & Graham, C. R. (2003). Blended learning environments: Definitions

and directions. The Quarterly Review of Distance Education, 4(3), pp. 227–233.
O’Toole, J. M., & Absalom, D. J. (2003). The Impact of Blended Learning on Student

Outcomes: is there room on the horse for two? Journal of Educational Media, 28(2–

3), 179–190. https://doi.org/10.1080/1358165032000165680

Paechter, M., Maier, B., & Macher, D. (2010). Students' Expectations of, and

Experiences in E-Learning: Their Relation to Learning Achievements and Course

Satisfaction. Computers & Education, 54(1), 222-229.

Palloff, R. M., Pratt, K., & Stockley, D. (2001). Building learning communities in

cyberspace: Effective strategies for the online classroom. The Canadian Journal of

Higher Education, 31(3), 175.

Park, S., Hironaka, S., Carver, P., & Nordstrum, L. (2013). Continuous Improvement in

Education. Advancing Teaching--Improving Learning. White Paper. Carnegie

Foundation for the Advancement of Teaching.

Paturusi, S., Usagawa, T., & Lumenta, A. (2016, October). A study of students'

satisfaction toward blended learning implementation in higher education institution

in Indonesia. In 2016 International Conference on Information & Communication

Technology and Systems (ICTS) (pp. 220-225). IEEE.

Piccoli, G., Ahmad, R., & Ives, B. (2001). Web-based virtual learning environments: A

research framework and a preliminary assessment of effectiveness in basic it skills

training. MIS Quarterly, 25(4), 401–426.

Picciano, A. G. (2009). Blending with purpose: the multimodal model. Journal of the

Research Center for Educational Technology, 5(1), 4-14.


Radford, A. (1997). The future of multimedia in education. First Monday, 2(11).

Rahman, N., Hussein, N., & Aluwi, A. (2015). Satisfaction on blended learning in a

public higher education institution: What factors matter? Procedia - Social and

Behavioral Sciences, 211, 768-775.

Reasons, S., Valadares, K., & Slavkin, M. (2005). Questioning the hybrid model: Student

outcomes in different course formats. Journal of Asynchronous Learning Networks,

9(1), 83-94.

ReVelle, J. B. (2004). Quality essentials: A reference guide from A to Z. ASQ Quality

Press.

Riffenburgh, R. H. (2012). Statistics in medicine (3rd ed., pp. 573-574). Academic Press.

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: Free Press.

Ross, B., & Gage, K. (2006). Global perspectives on blending learning. In C. J. Bonk &

C. R. Graham (Eds.), The handbook of blended learning (pp. 155-168).

Rossett, A., Douglis, F., & Frazee, R. V. (2003). Strategies for building blended

learning. Learning circuits, 4(7), 1-8.

Rossi, P. H., Wright, J. D., & Anderson, A. B. (2013). Handbook of survey research.

Academic Press.

Sajid, M., Laheji, A., Abothenain, F., Salam, Y., Aljayar, D., & Obeidat, A. (2016). Can

blended learning and the flipped classroom improve student learning and satisfaction

in Saudi Arabia? International Journal of Medical Education, 7, 281.


Santhanam, R., Sasidharan, S., & Webster, J. (2008). Using self-regulatory learning to

enhance e-learning-based information technology training. Information Systems

Research, 19, 26–47.

Scott, P., Gallacher, J., & Parry, G. (2017). New languages and landscapes of higher

education. Oxford University Press.

Shedletsky, L., & Aiken, J. E. (2001). The paradoxes of online academic work.

Communication Education, 50(3), 206-217.

Shewhart, W. A. (1939). Statistical method from the viewpoint of quality control. New

York: Dover Publications (1986 reprint, with an introduction by W. Edwards Deming).

Shirky, C. (2008). Here comes everybody: The power of organizing without

organizations. New York: Penguin.

Shokraiefard, A. (2011). Continuous quality improvement in higher education: A case

study in the Engineering School of Borås University (Master's thesis, Quality and

Environmental Management). University of Borås.

Sher, A. (2009). Assessing the relationship of student-instructor and student-student

interaction to student learning and satisfaction in Web-based online learning

environment. Journal of Interactive Online Learning, 8(2).

Singh, H. (2003). Building effective blended learning programs. Educational

Technology, 43(6), 51-54.

Small, F., Dowell, D., & Simmons, P. (2012) Teacher communication preferred over peer

interaction: Student satisfaction with different tools in a virtual learning

environment. Journal of International Education in Business, 5(2), 114 – 128.


Smart, K. L., & Cappel, J. J. (2006). Students’ perceptions of online learning: A

comparative study. Journal of Information Technology Education, 5, 201-219.

Retrieved from http://www.jite.org/documents/Vol5/v5p201-219Smart54.pdf

So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social

presence and satisfaction in a blended learning environment: Relationships and

critical factors. Computers & Education, 51, 318–336.

Sokovic, M., Pavletic, D., & Pipan, K. K. (2010). Quality improvement methodologies–

PDCA cycle, RADAR matrix, DMAIC and DFSS. Journal of achievements in

materials and manufacturing engineering, 43(1), 476-483.

Stacey, E., & Gerbic, P. (2008). Success factors for blended learning. Hello! Where are

you in the landscape of educational technology? Proceedings ascilite Melbourne

2008, 964-968.

Staker, H., & Horn, M. B. (2012). Classifying K-12 blended learning. Innosight Institute.

Strobl, J. (2007). Geographic learning. Geoconnexion International Magazine, 6(5). 46-

47. http://www.geoconnexion.com/publications/geo-international/

Squires, A., & Cloutier, R. (2011). Applying the Plan‐Do‐Check‐Act Cycle to Develop

Best Practices in Remote Online Systems Engineering Education. In INCOSE

International Symposium, 21(1), 1211-1223.

Tam, M. (2000). Constructivism, instructional design, and technology: Implications for

transforming distance learning. Educational Technology & Society, 3(2), 50-60.


Taradi, S., Taradi, M., Radić, K., & Pokrajac, N. (2005). Blending problem-based

learning with web technology positively affects student-learning outcomes in acid-

base physiology. Advances in Physiology Education, 29(1), 35-39.

Taylor, M. J., McNicholas, C., Nicolay, C., Darzi, A., Bell, D., & Reed, J. E. (2013).

Systematic review of the application of the plan–do–study–act method to improve

quality in healthcare. BMJ Quality & Safety, 23(4), 290-298.

Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research:

integrating quantitative and qualitative approaches in the social and behavioural

sciences. Thousand Oaks, CA: Sage.

The W. Edwards Deming Institute. (2018). PDSA. Retrieved from

https://deming.org/explore/p-d-s-a

Thomas, H. F., Simmons, R. J., Jin, G., Almeda, A. A., & Mannos, A. A. (2005).

Comparison of student outcomes for a classroom-based vs. an internet-based

construction safety course. The Journal of SH&E Research, 2(1), 1-15.

Thorne, K. (2003). Blended learning: how to integrate online & traditional learning.

Kogan Page Publishers.

Toth, E., Morrow, L., & Ludvico, R. (2009). Designing blended inquiry learning in a

laboratory context: A study of incorporating hands-on and virtual laboratories.

Innovative Higher Education, 33(5), 333-344.

Twigg, C. A. (2003). Improving learning and reducing costs: Lessons learned from round

I of the PEW grant program in course redesign. Center for Academic

Transformation, Rensselaer Polytechnic Institute. Troy, NY.


Tyler, L. (2005). ICT literacy: Equipping students to succeed in an information-rich,

technology-based society.

UNESCO. (2002). UNESCO report: ICTs in teacher education: A planning guide.

UNESCO.

US Department of Education. (2017). Reimagining the Role of Technology in Higher

Education: A Supplement to the National Education Technology Plan. Office of

Educational Technology, Washington, D.C. Retrieved from

https://tech.ed.gov/files/2017/01/Higher-Ed-NETP.pdf

Vaughan, N. (2007). Perspectives on blended learning in higher education. International

Journal on E-learning, 6(1), 81-94.

Voos, R. (2003). Blended learning: What is it and where might it take us. Sloan-C

View, 2(1), 2-5.

Waha, B., & Davis, K. (2014). University students’ perspective on blended

learning. Journal of Higher Education Policy and Management, 36(2), 172-182.

Wang, Y. S. (2003). Assessment of learner satisfaction with asynchronous electronic

learning systems. Information & Management, 41(1), 75-86.

Wang, Q., & Huang, C. (2018). Pedagogical, social and technical designs of a blended

synchronous learning environment. British Journal of Educational

Technology, 49(3), 451-462.

Warschauer, M., & Liaw, M. L. (2010). Emerging Technologies in Adult Literacy and

Language Education. National Institute for Literacy.


Wheeless, J. M. (2009). Using PDSAs to improve student achievement and being pretty

darn successful at it. ASQ Higher Education Brief.

Woltering, V., Herrler, A., Spitzer, K., & Spreckelsen, C. (2009). Blended learning

positively affects students’ satisfaction and the role of the tutor in the problem-based

learning process: results of a mixed-method evaluation. Advances in Health Science

Education, 14, 725–738.

Woods, R., Baker, J. D., & Hopper, D. (2004). Hybrid structures: Faculty use and

perception of web-based courseware as a supplement to face-to-face instruction. The

Internet and Higher Education, 7(4), 281-297.

Woodside, A., & Biemans, W. (2005). Managing relationships, networks, and complexity

in innovation, diffusion, and adoption processes. Journal of Business & Industrial

Marketing, 20(7), 335-339.

Wu, J., & Liu, W. (2013). An empirical investigation of the critical factors affecting

students' satisfaction in EFL blended learning. Journal of Language Teaching &

Research, 4(1).

Wu, J.-H., Tennyson, R. D., & Hsia, T.-L. (2010). A study of student

satisfaction in a blended e-learning system environment. Computers & Education,

55(1), 155-164.

Wu, J. H., Tennyson, R. D., Hsia, T. L., & Liao, Y. W. (2008). Analysis of E-learning

innovation and core capability using a hypercube model. Computers in Human

Behavior, 24(5), 1851-1866.


Yang, Z., & Liu, Q. (2007). Research and development of web-based virtual online

classroom. Computers & education, 48(2), 171-184.

Yuen, A. H., Deng, L., Fox, R., & Tavares, N. J. (2009, August). Engaging students with

online discussion in a blended learning context: issues and implications.

In International Conference on Hybrid Learning and Education (pp. 150-162).

Springer, Berlin, Heidelberg.

Žuvić-Butorac, M., Rončević, N., Nemčanin, D., & Nebić, Z. (2011). Blended e-learning

in higher education: Research on students’ perspective. Issues in Informing Science

and Information Technology, 8, 409-429.


APPENDICES
APPENDIX A:

BLENDED COURSE STUDENT SURVEY


(This survey was adapted from the Blended Learning Student Survey developed by Long Island
University (n.d.). Retrieved from:
http://www.liu.edu/~/media/Files/AcademicAffairs/Outcomes/Blended%20Learning%20
Student%20Survey.ashx?la=en).
Please answer the following questions as clearly as you can by checking the box or line,
as appropriate. BLENDED courses have some face-to-face class meetings, but also have
some class sessions that are replaced with online instruction.
Age: _____ Gender: _____
Please indicate your level of satisfaction with the following statements:

Rating scale: 5 = Very satisfied; 4 = Generally satisfied; 3 = Neither; 2 = Generally dissatisfied; 1 = Very dissatisfied
1. I am satisfied with the level of
effort this course required.
2. I am satisfied with my
performance in this course.
3. Compared to face-to-face course
settings, I am satisfied with this
learning experience.
4. I am satisfied with the way I
interact with other students.
5. I am satisfied with my
participation in the class.
6. I am satisfied with the process of
collaboration activities during the
course.
7. In general, how satisfied were you
with your blended course(s)?
Please indicate your level of agreement with the following statements:

Rating scale: 5 = Strongly agree; 4 = Agree; 3 = Neutral; 2 = Disagree; 1 = Strongly disagree

8. I’m more likely to ask questions


in a blended course
9. There are more opportunities to
collaborate with others in a
blended course
10. My blended course experience
has increased my opportunity to
access and use information
11. Blended learning helps me better
understand course material
12. The use of blended learning
technology in this course
encourages me to learn
independently.
13. The online components of this
course effectively reinforced what
I was learning in the face-to-face
sessions of this course
14. My performance in exams is
improved compared to similar
courses I studied before.
15. I enjoy working on assignments
by myself.
16. Generally, I understand course
requirements better in a blended
course
17. The instructor made me feel that I
am a true member of the class.
18. The instructor used Moodle
technology appropriately.
19. The instructor clearly
communicated important due
dates/time frame for the activities.
20. The instructor provided clear
instructions on how to participate
in the course activities.
21. The instructor encouraged
students to explore new concepts
in the course
22. Feedback on evaluation of tests
and other assignments was given
in a timely manner.
23. A blended course format helps
me to manage my time better
24. I am satisfied with the
accessibility and availability of
the instructor.
25. I attend to all the activities offered
through Moodle (Quiz,
assignment, videos, etc.) the same
way I attend face-to-face classes.
26. Technical problems are not
frequent and they do not
adversely affect my understanding
of the course.
27. The technology used for blended
teaching (Moodle) is reliable.
28. Course content shown or
displayed using Moodle is clear.
29. Generally, I am more engaged in
my blended courses
In comparison to the interaction experienced with students and instructors in traditional face-to-face
courses, how would you describe the amount and quality of interaction experienced with . . .?

Rating scale: 5 = Much better; 4 = A little better; 3 = About the same; 2 = A little worse; 1 = Much worse

30. The amount of your interaction


with other students
31. The quality of your interaction
with other students
32. The amount of your interaction
with the instructor
33. The quality of your interaction
with the instructor
34. A blended learning course makes
it more important for students to
visit the lecturer during office-
hours.
Response options: Not all tools were used for our course; I find this tool easy to use; I find this tool hard to use; I find this tool useful; I do not find this tool useful

35. How would you describe your


experience with the tools in Moodle
(lesson files, online tasks, Forums,
Quizzes, surveys)
Response options: Too light; Light; Moderate; Heavy; Too heavy

36. Compared to your other courses, how would you rate the workload in this blended

learning course(s)?

1. Which class modality do you prefer? (Please select one of the following by

putting √ )

----- Entirely face-to-face

----- Minimal use of the Web, mostly held in face-to-face format

----- An equal mix of face-to-face and web content

----- Extensive use of the Web, but still some face-to-face class time

----- Entirely online with no face-to-face time


2. What was the most effective area of this blended learning course?

3. What was the least effective area of this blended learning course?

4. What advice would you give to a student considering a blended learning course

for the first time?
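For analysis, responses on the 5-point scales in this survey are typically coded numerically (5 = most favorable, 1 = least favorable) and summarized item by item. The sketch below is a minimal illustration of that coding in Python, assuming pandas and hypothetical file and column names (blended_survey_responses.csv, q1–q7); it is not part of the survey instrument or the thesis analysis.

# Illustrative sketch only: summarizing 5-point Likert responses.
# The file name and column names (q1-q7 for survey items 1-7) are
# hypothetical placeholders, not part of the thesis instrument.
import pandas as pd

# Each row is one student; each column holds an item coded 1-5
# (5 = Very satisfied ... 1 = Very dissatisfied).
responses = pd.read_csv("blended_survey_responses.csv")

satisfaction_items = [f"q{i}" for i in range(1, 8)]  # items 1-7

# Per-item mean and standard deviation give a quick satisfaction profile.
summary = responses[satisfaction_items].agg(["mean", "std"]).T
print(summary)

# Overall satisfaction score per student: the mean of items 1-7.
responses["satisfaction_score"] = responses[satisfaction_items].mean(axis=1)
print(responses["satisfaction_score"].describe())

A similar aggregation could be applied to the agreement items (8-29) and the interaction items (30-34), since they use the same 1-5 coding.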


APPENDIX B:

CONSENT TO ACT AS A RESEARCH SUBJECT


California State University, Dominguez Hills

Consent to Act as a Research Subject

EMPLOYING THE PDSA CYCLE TO IMPROVE THE QUALITY OF

BLENDED LEARNING IN BASIC MEDICAL SCIENCE COURSES

Introduction

This study asks about your experience with the blended learning system used in our

college. Your responses will help develop information for others considering the quality

of the current blended system. This study will use a written survey questionnaire to learn

about the students’ experiences with blended learning. The questions are designed to develop an

understanding of the effectiveness of the current blended learning system and opinions

about future requirements to enhance the quality of the current system.

You are being asked to participate in a research study. Before you give your consent to

volunteer, it is important that you read the following information and ask as many

questions as necessary to be sure that you understand what you will be asked to do.

Investigators:

Co-investigator: Areig Yahya Al-Ramadin, MSQA, College of Extended Education at

California State University, Dominguez Hills. Research Supervisor: Adjunct Professor

Keith A. Fulton, Quality Assurance Program, College of Extended & International

Education at California State University, Dominguez Hills.

Purpose of the Study: The purpose of the study is to provide blended learning

stakeholders with significant data concerning the effectiveness of the current learning

management system (Moodle) and assist instructors in developing a quality framework to


advocate the BL approach. This study will contribute to the development of pedagogy

and the Learning Management System (LMS) in the private sector in Saudi universities.

Ultimately, this research may be published as a scholarly work and might be presented as a

paper.

Description of the Study:

The students will be asked to complete a three-page paper survey of about 35 multi-part

questions. There will be no interview.

Risks or Discomforts:

There are no reasonably foreseeable or expected risks.

Benefits of the Study:

The results of this study may allow REU to revise the current blended learning system to

conform more closely to students’ preferences as reflected in the survey results. There

is no guarantee, however, that the participants will receive any

benefits from participating in this study.

Confidentiality:

The data collected from this survey do not contain names, emails or personal

identification. Confidentiality will be maintained to the extent allowed by law.

Incentives to Participate:

There will be no payment, compensation, or other incentives provided to the people

involved in the study.

Voluntary Nature of Participation:


Participation in this study is voluntary. Your choice of whether or not to participate will

not influence your future relations with the university or the course instructor. If you

decide to participate, you are free to withdraw your consent and stop your participation at

any time without penalty or loss of benefits to which you are entitled.

Questions about the Study: If you have any questions about this study or your rights as a

participant, you may contact the investigator Areig Yahya Al-Ramadin at

ramadin@riyadh.edu.sa or by telephone at 056 459 0665 or the Institutional Review

Board for the Protection of Human Subjects at CSUDH, 310-243-3756.

Your signature below indicates that you have read the information in this document and

have had a chance to ask any questions you may have about the study. Your signature

also indicates that you agree to be in the study and have been told that you can change

your mind and withdraw your consent at any time. You have been given a copy of this

consent form. You have been told that by signing this consent form you are not giving up

any of your legal rights.

Name of Participant (please print)

Signature of Participant Date

Areig Yahya Al-Ramadin


Signature of Investigator Date

Subject recruitment and data collection may not be initiated prior to formal written approval from the
California State University, Dominguez Hills Institutional Review Board
APPENDIX C:

RIYADH ELM UNIVERSITY IRB APPROVAL LETTER


