Submitted By: Group 6
Den ZesTer B. Mabasa (GROUP LEADER)
Members:
Angelica D. Tagudin
Diana Ara Dela Cruz
Kezia Jemima Parchamento
Mae Gragasin
John Irvin Dela Cruz
Michaela Mae Padua
Ryan Jay Sindanum
Justine Damance
CHAPTER 1
THE PROBLEM AND ITS BACKGROUND
Introduction
The world is going through tough times with the Covid-19 pandemic. Inevitably,
there have been severe impacts on education systems around the globe. Schools and
universities were closed, and millions of kids, adolescents and young adults have
been out of schools and universities. Nichols (2003) pointed out that the Internet
could be seen as (i) another delivery medium, (ii) a medium to add value to the
existing educational transaction or (iii) a way to transform the teaching and
learning process. The research and discourse surrounding the quality of online learning
provisions, student engagement and satisfaction have been ongoing among both
proponents and opponents of online learning (Biner et al. 1994; Rienties et
al. 2018). With the abrupt shift to online learning due to the Covid-19
pandemic, such discourse finds its relevance well beyond classic academic
research and debate. It is linked to the future of teaching and learning in
technology-enabled learning environments. Arguably, the adoption of technology
has disrupted traditional teaching practices, as teachers often find it difficult to
adjust and connect their existing pedagogy with technology (Sulisworo 2013).
Similarly, if informed policy decisions are not taken, this can affect knowledge
transfer processes as well as reduce the efficiency of teaching and learning
processes (Ezugwu et al. 2016).
One of the challenges of online digital learning relates to students' learning
experiences and achievement. Sampson et al. (2010) stated that students'
satisfaction and outcomes are good indicators for assessing the quality and
effectiveness of online programs. It is of concern for institutions to know whether
their students, in general, are satisfied with their learning experience (Kember and
Ginns 2012). Another essential element for quality online education is learner
engagement. Learner engagement refers to the effort the learner makes to promote
his or her psychological commitment to stay engaged in the learning process, to
acquire knowledge and build his or her critical thinking (Dixson 2015). While there
are different conceptualisations of student engagement (Zepke and Leach 2010),
advocates of learning analytics tend to emphasise the analysis of platform
access logs, including clicks on learning resources, when it comes to student
engagement in online learning (Rienties et al. 2018). The proposition is that being
active online through logins, active sessions and clicks reflects actual
engagement in an online course and results in better student performance. However,
this model mainly works in classic online modules, and there is limited
literature measuring students' engagement in activity-based hybrid learning
environments where there is a mix of online and offline activities (Rajabalee et
al. 2020).
The purpose of the current study was to explore the effect of online digital modules on
the academic performance of students at the higher education level. Specifically, this
research aims to identify the impact and effectiveness of digitalized modules on the
academic performance of Grade 12 HUMSS senior high school students. To address this,
the study seeks to answer the following questions:
1. What are the positive and negative effects of online digital modules on students?
2. What are the implications of online digital modules for the academic performance of
Grade 12 HUMSS students?
Conceptual Framework
In this research study, the researchers use a conceptual framework based on the
input-process-output (IPO) model to show the overall outcome of the study on the
effect of online digital modules on the academic performance of Grade 12 HUMSS
students at Rizal National High School.
Theoretical Framework
The study is anchored on the implications of digitized modules for the academic
performance of Grade 12 HUMSS students. Collaborative learning through a structured
blend of online tutorials and lectures, supplemented with Socratic dialogue, role-based
group assignments and other similar activities, seems to be a viable option in
the context of Rizal National High School Senior High School (RNHS-SHS). Successful
blending requires an understanding of the pedagogical attributes and affordances of
new and emerging learning technologies, the most desirable aspects of face-to-face
teaching, and the ways in which these aspects can be appropriately integrated, as
discussed in the following sections. Therefore, this chapter discusses the scope of
these technologies in Higher Education (HE) from various perspectives: their potential
impact on net-generation students, their affordances in student learning and
research, distinct features of online and blended learning, how these modes of
delivery compare with traditional face-to-face approaches, and their benefits to
higher education as well as the challenges they pose. This chapter also helps to
build an understanding of the conditions under which the enabling potential of
technology will be realised and, further, establishes the purpose of this study. In order
to establish the rationale for placing blended learning at the core of this study, this
chapter proposes a theoretical framework that serves as the foundation for the study.
According to Moore (2009), factors such as the use of learning
strategies, learning difficulties, peer-tutor support, the ability to apply knowledge and
the achievement of learning outcomes indicate the elements that impact the overall
satisfaction of students in online learning. A learning strategy is a set of tasks
through which learners plan and organize their engagement in a way that facilitates
knowledge acquisition and understanding (Ismail 2018). Enhancing the learning
process with appropriate learning strategies may contribute to better outcomes and
performances (Thanh and Viet 2016). Aung and Ye (2016) reported that students'
success and achievement were positively related to student satisfaction.
Definition of Terms
The following terms were defined for clarity and common understanding of this
study:
Gadget - A small mechanical or electronic device or tool, especially an ingenious
or novel one.
Internet - A global computer network providing a variety of information and
communication facilities, consisting of interconnected networks using standardized
communication protocols.
Implication - A conclusion that can be drawn from something although it is not
explicitly stated.
Module - Refers to learning sheets that students study and work through to gain
knowledge of their lessons.
Negative Effect of Online Digital Modules - Since online digital modules were
introduced, a great deal of negative feedback has come from students, and providers
still tend to struggle with student feedback. Students completing regular assessments
become dissatisfied when they experience a lack of personalized feedback. Traditional
methods of providing student feedback do not always work in an online digital module
environment, and because of this, online digital module providers are forced to look
toward alternative methods of providing feedback. Providing student feedback in an
online setting is still a relatively unresearched topic area, and it might take a
while for any specific strategies to become fully research-based and proven effective.
Online Digital Module - The term most often used to describe online lessons or
units. Online digital modules typically contain content and activities organized
into a structured lesson.
Positive Effect of Online Digital Learning - A well-thought-out online digital
module can benefit everyone. At first glance, that might seem like hyperbole;
in reality, however, online digital modules are just that effective. While you
might assume that online digital modules serve the learner primarily, their benefits
are far more wide-reaching. Online digital modules can improve the performance of
individuals, businesses, and society by promoting vital, future-ready skills such as
communication, collaboration, technology and education.
Self-Learning Module (SLM) - Designed to help students understand learning
from a psychological standpoint. It focuses on the various approaches used to
explain how and why people learn, and explores how these theories apply to the
teaching-learning process. Self-learning modules are designed to provide learners
with a solid knowledge base and actualize learning experiences.
Research Hypothesis
Research Instrument
In this study, the instrument utilized was a survey questionnaire for collecting
and gathering data. The questionnaire is the most common tool for collecting primary
data. A questionnaire allows data to be collected in a systematized way, so the data
are initially consistent and coherent for analysis. A questionnaire must have a
definite purpose that is related to the key research objectives, and it must be clear
from the start how the findings will be used (Roopa & Rani, 2012).
The research questionnaire is divided into three (3) parts. The first part contains
the demographic profile of the respondents. The second part consists of multiple-choice
items. The third part contains open-ended questions regarding the positive and negative
effects of the Online Digital Module on the academic performance of Grade 12 students
at Rizal National High School.
Development and Administration of Research Instrument
For this study, the research involved the use of a self-structured survey
questionnaire with guide questions and answer options. The survey questions are
divided according to the variables or factors used by the researchers, to guide the
respondents toward satisfying the research objectives. The survey was distributed
through Google Forms for participants who preferred to answer online. Part 1 of the
questionnaire collects the demographic profile of each respondent. In Part 2,
respondents answer seven items with choices to select. Part 3 contains three
open-ended questions with options to fill in.
Data Gathering Procedure
In gathering the data for this research, the researchers first asked the respondents
for permission and confirmed that they were available to answer the survey
questionnaire through Google Forms. The researchers sent a permission message to all
participants, serving as an agreement that they would voluntarily take some of their
time to answer the survey. After the participants finished answering, the researchers
collected the responses and explained to the participants that their answers would
serve as the basis for the data gathering and analysis, so that the researchers would
be able to answer the statement of the problem.
For Parts 2 and 3, the researchers used the weighted mean. The weighted mean is
obtained by multiplying each value of the group by the appropriate weight factor,
summing the products, and dividing by the total number of respondents. This tool was
used to answer and obtain the data for the second and third parts of the survey
questionnaire.
Agree 3.00-3.75
Disagree 1.00-2.00
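As a quick illustration, the weighted-mean computation described above can be sketched in Python. The response counts below are hypothetical, not the study's actual data; only the formula follows the text.

```python
# Weighted mean of a Likert item: multiply each scale value by the
# number of respondents who chose it, sum the products, and divide
# by the total number of respondents.

def weighted_mean(counts):
    """counts maps a scale value (4 = Strongly Agree ... 1 = Disagree)
    to the number of respondents who selected it."""
    total = sum(counts.values())
    return sum(value * n for value, n in counts.items()) / total

# Hypothetical distribution for one item answered by 30 respondents.
responses = {4: 10, 3: 12, 2: 5, 1: 3}
print(round(weighted_mean(responses), 2))  # → 2.97
```

The resulting mean is then compared against the interpretation scale above to decide whether the item falls in the Agree or Disagree range.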
CHAPTER 4
PRESENTATION, INTERPRETATION, AND ANALYSIS OF DATA
Introduction
In this chapter, the research results and findings are presented. The findings are
used to answer the research objectives and questions in the earlier chapter of this
research. The data analysis part will explain the summary statistics, descriptive
analysis, correlation and regression analysis. At the end of this chapter, the overall
results are discussed in detail.
Descriptive Analysis
From a total of 30 questionnaires distributed via online survey (email invitation)
to respondents from the three sections of Grade 12 HUMSS students, a total of 30
responses were used for the analysis.
Respondents Profile
The summary of the respondents' profile is shown in Table 16 below:

Characteristics                          Frequency    Percentage
Gender    Female                             20          66.7
Age       17 years                           15          50.0
          18 years                           10          33.3
          19 years                            5          16.7
Section   12 HUMSS A                         10          33.33
          12 HUMSS B                         10          33.33
          12 HUMSS C                         10          33.33
Monthly fees for internet or load
          50-100 pesos                       15          50.0
          1500-2000 pesos                    10          33.3
          100-300 pesos                       5          16.7
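The percentage column is each category's frequency divided by the 30 respondents. A minimal sketch, using the age frequencies from the table above:

```python
# Percentage share of each category: frequency / total respondents * 100.

def percentages(freqs, total):
    return {label: round(n / total * 100, 1) for label, n in freqs.items()}

ages = {"17 years": 15, "18 years": 10, "19 years": 5}
print(percentages(ages, total=30))
# → {'17 years': 50.0, '18 years': 33.3, '19 years': 16.7}
```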
Analysis of Measures
Figure 4 shows the mean and standard deviation of the responses for all
measurements that fall under the independent and dependent variables. The highest
mean score among all measurements is in the variable of Performance Expectancy,
on which the average respondent agrees regarding the Online Digital Module (and
its possible effects on their academic performance). The next highest measurement,
with a mean score of 3.45, is in the variable of the Online Digital Module's effect
on the academic performance of Grade 12 HUMSS students. It shows that the average
respondent is not interested in the Online Digital Module, while several respondents
slightly agree with the new way of learning. Detailed results for each questionnaire
item are shown in Figure 4 below:
Part 2 of the Research Survey Questionnaire

Variables           Description of Measures                         Mean    Standard Deviation
Strongly Agree      Is it possible for students to learn            3.75    2.50
                    anything in the Online Digital Module?
Agree               Will there be an impact on students from        3.50    2.35
                    the Online Digital Module way of learning?
Slightly Disagree   Is it possible for students to learn            2.00    1.75
                    anything in nine subjects in one week
                    of study?
Disagree            Did you understand the way of learning in       3.75    2.50
                    the Online Digital Module?
                    Does the Online Digital Module affect your      3.00    1.50
                    mental and physical health?
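The mean and standard deviation reported in Figure 4 can be computed per item with Python's standard library. The scores below are hypothetical stand-ins for one item's raw responses, not the study's data:

```python
# Mean and sample standard deviation for one questionnaire item.
from statistics import mean, stdev

scores = [4, 4, 3, 3, 3, 2, 4, 3, 2, 1]  # hypothetical Likert responses
print(round(mean(scores), 2), round(stdev(scores), 2))  # → 2.9 0.99
```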
Testing Hypotheses
From the multiple regression analysis, the standardized coefficient (β) between
Strongly Agree (SA) and Strongly Disagree is 0.250, with a p-value of 0.046, which is
significant at α = 0.05. Meanwhile, the standardized coefficient (β) between
Agree (A) and Slightly Disagree is 0.167, with a p-value of 0.000, which is significant
at α = 0.05. Hence, the results support the first hypothesis (H1) that there would be
positive and negative effects on the students' academic performance; the Strongly
Agree and Disagree groups had the same p-value and the same significance level.
In testing the second hypothesis, the standardized coefficient (β) between Agree (A)
and Slightly Disagree is 0.013, with a p-value of 0.046, which is significant at
α = 0.05. Meanwhile, the standardized coefficient (β) between Strongly Agree (SA)
and Disagree (D) is 0.118, with a p-value of 0.034, which is significant at
α = 0.05. Hence, the results do not support the second hypothesis (H2) that
there are implications of online digital modules for the academic performance of
Grade 12 HUMSS students; the Agree and Slightly Disagree groups had the same
p-value and standard deviation.
From the multiple regression analysis, the standardized coefficient (β) between
Strongly Agree (SA) and Slightly Disagree (SD) is 0.325, with a p-value of 0.000,
which is significant at α = 0.05. Meanwhile, the standardized coefficient (β)
between Agree (A) and Disagree (D) is 0.137, with a p-value of 0.016, which is
significant at α = 0.05. Hence, the results support the third hypothesis
(H3) that the Online Digital Module has a significant effect on students' academic
performance; the Strongly Agree and Slightly Disagree groups had the same
p-value and standard deviation.
Two of the three hypotheses are supported by the results, while one hypothesis is
rejected. The summary of the hypothesis results in this research is presented in
figure 7.
Hypotheses                                                       Result
(H1) There would be positive and negative effects on the         Supported
students' academic performance; the Strongly Agree and
Disagree groups had the same p-value and the same
significance level.
Introduction
Based on the results obtained in Chapter 4, a discussion of the findings is presented
in this chapter. The findings from the study are used to discuss whether the
proposed hypotheses are supported. All research questions are answered
subsequently, and finally the achievement of the research objectives is determined.
Summary of Findings
From the results of Chapter 4, it is confirmed that there are three proposed
hypotheses. The three hypotheses were tested, and the summary of the hypothesis
testing results is shown in figure 7.
There are three major questions in this research study. The first question is: What
are the positive and negative effects of online digital modules on students? Based on
the results in Chapter 4, there are many positive and negative effects of the Online
Digital Module on the academic performance of Grade 12 HUMSS students at Rizal
National High School. The major one, according to their answers to the research survey
questionnaire, is that the Online Digital Module affects their mental and physical
health; it also affects their learning, as they cannot understand the lessons because
of the lack of discussion.
The second question is: What are the implications of online digital modules for the
academic performance of Grade 12 HUMSS students? Based on the results in the previous
chapter, the major implications of the Online Digital Module for the academic
performance of Grade 12 HUMSS students are the lack of good internet and the lack of
discussion, while other answers say that it is well balanced for learning and studying
through the online digital module.
For the final question: Does the online digital module have a significant effect on
students' academic performance? Based on the results in Chapter 4, there is a
significant effect on students' academic performance from the Online Digital Module.
According to their answers to the research survey questionnaire, there are many
effects that students experience with the Online Digital Module, namely the lack of
internet and the effects on their mental and physical health.
Conclusion
The goal of this thesis is to determine the implications and effects of the Online
Digital Module on the academic performance of Grade 12 HUMSS students at Rizal
National High School, and to answer the research questions in order to determine its
primary effects. With the research survey completed, the researchers found that
almost 60% of Grade 12 HUMSS students report that the Online Digital Module affects
their academic performance, while 40% of the respondents say that the online digital
module is well balanced for learning and understanding. In conclusion, this research
study finds both positive and negative effects of the Online Digital Module on
students' academic performance. Furthermore, the effects depend on students'
understanding and the way they learn in this new way of learning.
References:
Allen, M., Bourhis, J., & Burrell, N. (2002). Comparing student satisfaction with distance
education to traditional classrooms in higher education: A meta-analysis. American Journal of
Distance Education., 16(2), 83–97. https://doi.org/10.1207/S15389286AJDE1602_3.
Anderson, A., Huttenlocher, D., Kleinberg, J. & Leskovec, J. (2014). Engaging with massive
online courses. In Proceedings of the 23rd international conference on World wide web (pp. 687-
698). ACM. https://doi.org/10.1145/2566486.2568042.
Ashby, A., Richardson, J., & Woodley, A. (2011). National student feedback surveys in distance
education: An investigation at the UK Open University. Open Learning: The Journal of Open,
Distance and e-Learning., 26(1), 5–25. https://doi.org/10.1080/02680513.2011.538560.
Aung, J., & Ye, Y. (2016). The relationship between the levels of students’ satisfaction and their
achievement at Kant Kaw education Center in Myanmar. Scholar: Human Sciences, 8(1), 38
Retrieved from http://repository.au.edu/handle/6623004553/17994.
Biswas, A., Das, S., & Ganguly, S. (2018). Activity-based learning (ABL) for engaging
engineering students. In Industry Interactive Innovations in Science, Engineering and
Technology (pp. 601-607). Singapore: Springer. https://doi.org/10.1007/978-981-10-3953-9_58.
Biner, P., Dean, R., & Mellinger, A. (1994). Factors underlying distance learner satisfaction with
televised college-level courses. American Journal of Distance Education., 8(1), 60–
71. https://doi.org/10.1080/08923649409526845.
Cho, M., & Heron, M. (2015). Self-regulated learning: The role of motivation, emotion, and use
of learning strategies in students’ learning experiences in a self-paced online mathematics
course. Distance Education., 36(1), 80–99. https://doi.org/10.1080/01587919.2015.1019963.
Czerkawski, B. C., & Lyman, E. W. (2016). An instructional design framework for fostering
student engagement in online learning environments. TechTrends, 60(6), 532–
539. https://doi.org/10.1007/s11528-016-0110-z.
Dixson, M. (2010). Creating effective student engagement in online courses: What do students
find engaging? Journal of the Scholarship of Teaching and Learning, pp., 1–13 Retrieved
from https://scholarworks.iu.edu/journals/index.php/josotl/article/view/1744.
Dixson, M. (2015). Measuring student engagement in the online course: The online student
engagement scale (OSE). Online Learning, 19(4). https://doi.org/10.24059/olj.v19i4.561.
Dziuban, C., Moskal, P., Thompson, J., Kramer, L., DeCantis, G., & Hermsdorfer, A. (2015).
Student satisfaction with online learning: Is it a psychological contract? Online Learning,
19(2). https://doi.org/10.24059/olj.v19i2.496.
Earl-Novell, S. (2006). Determining the extent to which program structure features and
integration mechanisms facilitate or impede doctoral student persistence in
mathematics. International Journal of Doctoral Studies, 1, 52–55. https://doi.org/10.2894/560.
Ezugwu, A., Ofem, P., Rathod, P., Agushaka, J., & Haruna, S. (2016). An empirical evaluation
of the role of information and communication Technology in Advancement of teaching and
learning. Procedia Computer Science, 92, 568–577. https://doi.org/10.1016/j.procs.2016.07.384.
Fallon, E., Walsh, S., & Prendergast, T. (2013). An activity-based approach to the learning and
teaching of research methods: Measuring student engagement and learning. Irish Journal of
Academic Practice, 2(1), 2. https://doi.org/10.21427/D7Q72W.
Gelan, A., Fastré, G., Verjans, M., Martin, N., Janssenswillen, G., Creemers, M., Lieben, J.,
Depaire, B., & Thomas, M. (2018). Affordances and limitations of learning analytics for
computer-assisted language learning: A case study of the VITAL project. Computer Assisted
Language Learning., 31(3), 294–319. https://doi.org/10.1080/09588221.2017.1418382.
Gillett-Swan, J. (2017). The challenges of online learning: Supporting and engaging the isolated
learner. Journal of Learning Design., 10(1), 20–30. https://doi.org/10.5204/jld.v9i3.293.
Greller, W., Santally, M., Boojhawon, R., Rajabalee, Y., & Kevin, R. (2017). Using learning
analytics to investigate student performance in blended learning courses. Journal of Higher
Education Development–ZFHE, 12(1) Retrieved
from: http://www.zfhe.at/index.php/zfhe/article/view/1004.
Guri-Rosenblit, S. (2009). Distance education in the digital age: Common misconceptions
and challenging tasks. Journal of Distance Education. Retrieved
from: https://eric.ed.gov/?id=EJ851907
Handelsman, M., Briggs, W., Sullivan, N., & Towler, A. (2005). A measure of college student
course engagement. The Journal of Educational Research., 98(3), 184–
192. https://doi.org/10.3200/JOER.98.3.184-192.
Hartman, J., & Truman-Davis, B. (2001). Factors relating to the satisfaction of faculty teaching
online courses at the University of Central Florida. Online education, 2, 109–128.
Ismail, A. (2018). Empowering your students satisfaction with blended learning: A lesson from
the Arabian Gulf University distance teaching and training program. International Journal of
Information and Education Technology, 8(2), 81–
94. https://doi.org/10.18178/ijiet.2018.8.2.1019.
Kagklis, V., Karatrantou, A., Tantoula, M., Panagiotakopoulos, C., & Verykios, V. (2015). A
learning analytics methodology for detecting sentiment in student fora: A case study in distance
education. European Journal of Open, Distance and E-learning., 18(2), 74–
94. https://doi.org/10.1515/eurodl-2015-0014.
Kahu, E. (2013). Framing student engagement in higher education, studies in higher education,
38:5, 758-773, https://doi.org/10.1080/03075079.2011.598505.
Kember, D., & Ginns, P. (2012). Evaluating teaching and learning: A practical handbook for
colleges, universities and the scholarship of teaching. Routledge., 66, 375–
377. https://doi.org/10.1007/s10734-012-9557-9.
Kugamoorthy, S. (2017). Activity based learning: An effective approach for self-regulated
learning practices. Department of secondary and tertiary education, faculty of education. The
Open University of Sri Lanka
Kuh, G. (2003). What we're learning about student engagement from NSSE: Benchmarks for
effective educational practices. Change: The Magazine of Higher Learning., 35(2), 24–
32. https://doi.org/10.1080/00091380309604090.
Lauría, E., Baron, J., Devireddy, M., Sundararaju, V. and Jayaprakash, S. (2012). Mining
academic data to improve college student retention: An open source perspective. In Proceedings
of the 2nd International Conference on Learning Analytics and Knowledge (pp. 139-142).
ACM. https://doi.org/10.1145/2330601.2330637.
Lee, J. (2010). Online support service quality, online learning acceptance, and student
satisfaction. The Internet and Higher Education, 13(4), 277–
283. https://doi.org/10.1016/j.iheduc.2010.08.002.
Lee, J., Song, H., & Hong, A. (2019). Exploring factors, and indicators for measuring students’
sustainable engagement in e-learning. Sustainability, 11(4),
985. https://doi.org/10.3390/su11040985.
Lehmann, T., Hähnlein, I., & Ifenthaler, D. (2014). Cognitive, metacognitive and motivational
perspectives on preflection in self-regulated online learning. Computers in Human Behaviour,
32, 313–323. https://doi.org/10.1016/j.chb.2013.07.051.
Li, N., Marsh, V., & Rienties, B. (2016). Modelling and managing learner satisfaction: Use of
learner feedback to enhance blended and online learning experience. Decision Sciences Journal
of Innovative Education., 14(2), 216–242. https://doi.org/10.1111/dsji.12096.
Ma, J., Han, X., Yang, J., & Cheng, J. (2015). Examining the necessary condition for
engagement in an online learning environment based on learning analytics approach: The role of
the instructor. The Internet and Higher Education, 24, 26–
34. https://doi.org/10.1016/j.iheduc.2014.09.005.
Macfadyen, L., & Dawson, S. (2012). Numbers are not enough. Why e-learning analytics failed
to inform an institutional strategic plan. Educational Technology & Society, 15(3), 149–163
Retrieved from https://www.jstor.org/stable/pdf/jeductechsoci.15.3.149.pdf.
Markova, T., Glazkova, I., & Zaborova, E. (2017). Quality issues of online distance
learning. Procedia-Social and Behavioral. Sciences., 237, 685–
691. https://doi.org/10.1016/j.sbspro.2017.02.043.
Marsh, H. (1982). SEEQ: A reliable, valid, and useful instrument for collecting Students’
evaluations of university teaching. British journal of educational psychology., 52(1), 77–
95. https://doi.org/10.1111/j.2044-8279.1982.tb02505.x.
Mihanović, Z., Batinić, A. B., & Pavičić, J. (2016). The link between Students' satisfaction with
faculty, overall Students’ satisfaction with student life and student performances. Review of
Innovation and Competitiveness: A Journal of Economic and Social Research, 2(1), 37–
60. https://doi.org/10.32728/ric.2016.21/3.
Moore, J. (2009). A synthesis of Sloan-C effective practices. Journal of Asynchronous Learning
Networks, 13(4), 73–97. https://doi.org/10.24059/olj.v13i4.1649.
Ni, A. (2013). Comparing the effectiveness of classroom and online learning: Teaching research
methods. Journal of Public Affairs Education., 19(2), 199–
215. https://doi.org/10.1080/15236803.2013.12001730.
Nichols, M. (2003). A theory for eLearning. Educational Technology & Society, 6(2), 1–10
Retrieved from: https://www.jstor.org/stable/jeductechsoci.6.2.1.
Pardo, A., Han, F., & Ellis, R. A. (2017). Combining university student self-regulated learning
indicators and engagement with online learning events to predict academic performance. IEEE
Transactions on Learning Technologies, 10(1), 82–
92. https://doi.org/10.1109/TLT.2016.2639508.
Potter, W. J., & Levine-Donnerstein, D. (1999). Rethinking validity and reliability in content
analysis. Journal of Applied Communication Research, 27(3), 258–
284. https://doi.org/10.1080/00909889909365539.
Rajabalee, Y. B., Santally, M. I., & Rennie, F. (2020). Modeling students' performances
in activity-based e-learning from a learning analytics perspective: Implications and
relevance for learning design. International Journal of Distance Education
Technologies. https://doi.org/10.4018/IJDET.2020100105
Ramsden, P. (1991). A performance indicator of teaching quality in higher education: The course
experience questionnaire. Studies in higher education., 16(2), 129–
150. https://doi.org/10.1080/03075079112331382944.
Rienties, B., Lewis, T., McFarlane, R., Nguyen, Q., & Toetenel, L. (2018). Analytics in online
and offline language learning environments: The role of learning design to understand student
online engagement. Computer Assisted Language Learning, 31(3), 273–
293. https://doi.org/10.1080/09588221.2017.1401548.
Robinson, C., & Hullinger, H. (2008). New benchmarks in higher education: Student
engagement in online learning. Journal of Education for Business, 84(2), 101–
108. https://doi.org/10.3200/JOEB.84.2.101-109.
Roblyer, M., & Wiencke, W. (2004). Exploring the interaction equation: Validating a rubric to
assess and encourage interaction in distance courses. Journal of Asynchronous Learning
Networks, 8(4), 24–37.
Sampson, P., Leonard, J., Ballenger, J., & Coleman, J. (2010). Student satisfaction of
online courses for educational leadership. Online Journal of Distance Learning
Administration, 13(3). Retrieved from: https://eric.ed.gov/?id=EJ914153.
Sinclair, P., Levett-Jones, T., Morris, A., Carter, B., Bennett, P., & Kable, A. (2017). High
engagement, high quality: A guiding framework for developing empirically informed
asynchronous e-learning programs for health professional educators. Nursing & Health
Sciences., 19(1), 126–137. https://doi.org/10.1111/nhs.12322.
Smallwood, B. (2006). Classroom survey of student engagement. Retrieved
from: http://www.unf.edu/acadaffairs/assessment/classe/overview.html.
Smith, V. C., Lange, A., & Huston, D. R. (2012). Predictive modeling to forecast student
outcomes and drive effective interventions in online community college courses. Journal of
asynchronous learning networks, 16(3), 51–61. https://doi.org/10.24059/olj.v16i3.275.
Strang, K. (2017). Beyond engagement analytics: Which online mixed-data factors predict
student learning outcomes? Education and Information Technologies., 22(3), 917–
937. https://doi.org/10.1007/s10639-016-9464-2.
Sulisworo, D. (2013). The paradox on IT literacy and science’s learning achievement in
secondary school. International journal of evaluation and research in education (IJERE), 2(4),
149–152. https://doi.org/10.11591/ijere.v2i4.2732.
Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for
feedback generation: Learning analytics in a data-rich context. Computers in Human Behavior,
47, 157–167. https://doi.org/10.1016/j.chb.2014.05.038.
Thanh, N., & Viet, N. (2016). How to increase student’s satisfaction at higher education
institutes (HEIs) today? Journal of Studies in Social Sciences and Humanities, 2(4), 143–151.
Virtanen, M. A., Kääriäinen, M., Liikanen, E., & Haavisto, E. (2017). The comparison of
students’ satisfaction between ubiquitous and web-basedlearning environments. Education and
Information Technologies, 22(5), 2565–2581. https://doi.org/10.1007/s10639-016-9561-2.
Yunus, K., Wahid, W., Omar, S., & Ab Rashid, R. (2016). Computer phobia among adult
university students. International Journal of Applied Linguistics and English Literature, 5(6),
209–213. https://doi.org/10.7575/aiac.ijalel.v.5n.6p.209.
Zepke, N., & Leach, L. (2010). Improving student engagement: Ten proposals for action. Active
Learning in Higher Education, 11(3), 167–177.
Zerihun, Z., Beishuizen, J., & Van Os, W. (2012). Student learning experience as indicator of
teaching quality. Educational Assessment, Evaluation and Accountability., 24(2), 99–
111. https://doi.org/10.1007/s11092-011-9140-4.