
Intelligent distance learning systems

Dragan Vasiljević1, Branko Latinović2


1 PhD candidate at Pan-European University APEIRON, Banja Luka, Bosnia and Herzegovina
2 Professor at Pan-European University APEIRON, Banja Luka, Bosnia and Herzegovina
vasiljevicdj68@gmail.com, branko.b.latinovic@apeiron-edu.eu

Abstract: Models used for creating intelligent systems based on artificial neural
networks indicate to teachers which educational and teaching activities should be
corrected. The activities that may require correction are performed in established
distance learning systems and include: lectures, assignments, tests, grading,
competitions, directed leisure activities, and case studies. The results of data
processing in artificial neural networks point to the specific activity that needs
to be maintained, promoted, or changed in order to improve students' abilities and
achievements. The developed models are also very useful to students, who can understand
their achievements much better and develop the skills needed for future competencies.
These models indicate that the abilities of students who use some of the mentioned
distance learning systems are far more developed than those of students who learn
through the traditional classroom system.
Keywords: neural networks, distance learning system, achievements,
competencies

INTRODUCTION

Predicting future events is one of the basic areas in the application of artificial neural networks [1]. In this case, the future events in question are the abilities students need as well as their achievements. Unlike classic, model-based methods, artificial neural networks belong to the class of self-adaptive, data-driven methods that make only a few model assumptions about the problems being studied [2]. The research problem concerns the activities in the teaching process that can be improved. Through data training and testing, artificial neural networks should provide a projection of students' abilities and competences [3]. The subject of research in this paper is the development and application of algorithms for the predictive use of neural networks in distance learning systems.

Most research on the use of artificial neural networks in distance learning systems is one-dimensional, because it focuses on methods and techniques applied either to the electronic learning system alone or specifically to students and teachers [4-6], while fewer papers deal with the problem of learning based on past events [7-8].

So far, this area has not been sufficiently explored, especially in the field of e-learning, as indicated by the large number of scientific papers published in 2019 that refer to robotics, economy, and natural phenomena [9-13]. To support the creation of intelligent distance learning systems, this paper presents a method of analyzing data using artificial neural networks.

METHOD

Artificial neural networks are an integral part of artificial intelligence, primarily used for numerical prediction [14,15], classification [16,17], and pattern recognition [18,19,20].

The data should be divided into three samples: one for training, one for cross-validation, and one for testing [20]. Once the model is defined, the input prepared, the algorithm selected, and the learning rule and required functions established, the network should be trained on the prepared data in order to identify the connections among the data and to become capable of predicting values from input values. The learning phase is a process of adjusting the network weights that takes place over multiple iterations, or passes through the network. In the cross-validation phase, the length of training, the number of hidden neurons, and the other parameters are monitored. Network testing is the third phase of neural network operation and is crucial for network evaluation. The difference between the learning and the testing phase is that in the testing phase the network is no longer learning. Network evaluation is done by calculating the error, that is, by comparing the network outputs with the actual outputs [22].
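As an illustration of these three phases, the following sketch splits a data set into training, selection, and testing samples and evaluates a small network by comparing its outputs with the actual outputs. It is only a minimal sketch assuming scikit-learn and hypothetical data; the experiments in this paper were carried out in Neural Designer, not with this code.

```python
# Minimal sketch of the three-phase workflow described above (training,
# cross-validation/selection, testing); assumes scikit-learn is installed.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

def three_phase_evaluation(X, y, seed=42):
    # Split off 40% of the data, then divide it evenly into a
    # selection (cross-validation) sample and a testing sample.
    X_train, X_rest, y_train, y_rest = train_test_split(
        X, y, test_size=0.4, random_state=seed)
    X_sel, X_test, y_sel, y_test = train_test_split(
        X_rest, y_rest, test_size=0.5, random_state=seed)

    # Training adjusts the weights over multiple iterations; the
    # selection sample is used to monitor generalization.
    model = MLPRegressor(hidden_layer_sizes=(1,), activation="tanh",
                         max_iter=1000)
    model.fit(X_train, y_train)
    selection_error = mean_squared_error(y_sel, model.predict(X_sel))

    # Testing compares the network outputs with the actual outputs.
    testing_error = mean_squared_error(y_test, model.predict(X_test))
    return model, selection_error, testing_error
```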
RESULT

Lectures, assignments, tests, grading, competitions, and directed leisure activities are the input variables used for in-depth data analysis and for creating the neural network model [23]. The output variable is a satisfactory level of achievement. The data for the defined variables refer to 102 elementary school students (eighth graders) analyzed during the 2018-2019 school year. For the purposes of this paper, the data were loaded into the Neural Designer studio database for further processing and for creating an artificial neural network. The statistics of the defined input variables are important information for designing models, since they can reveal the presence of false data. Table 1 displays the minimum, maximum, mean, and standard deviation of all variables in the data set.

Table 1

Table 2 displays the correlations of all input variables. The minimum correlation is -0.440712, between the lecture and competition variables. The maximum correlation is 0.9555376, between tasks and tests.

Table 2
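The kind of statistics reported in Tables 1 and 2 can be obtained from such a data set with a short script. The sketch below assumes a hypothetical CSV file and column names, since the actual data were processed in Neural Designer rather than in code.

```python
# Sketch of the statistics behind Tables 1 and 2 (minimum, maximum, mean,
# standard deviation, and pairwise correlations); assumes pandas is installed.
# "students.csv" and the column names are hypothetical placeholders.
import pandas as pd

columns = ["lectures", "assignments", "tests", "grading",
           "competitions", "leisure_activities", "achievement"]
data = pd.read_csv("students.csv", usecols=columns)

# Table 1: minimum, maximum, mean, and standard deviation of every variable.
table1 = data.agg(["min", "max", "mean", "std"]).T
print(table1)

# Table 2: correlations of the input variables (the paper reports a minimum
# of about -0.44 between lectures and competitions and a maximum of about
# 0.96 between tasks and tests).
table2 = data.drop(columns="achievement").corr()
print(table2)
```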
The artificial neural network is defined as a single layer with forward propagation, while the activation function is the hyperbolic tangent, which can be expressed mathematically as follows:

\tanh(x) = \dfrac{e^{x} - e^{-x}}{e^{x} + e^{-x}} \qquad (1)

where x represents the independent variable. A graphic representation of the hyperbolic tangent function is given in Figure 1.

Figure 1. Graphic representation of the hyperbolic tangent function
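A brief numerical check of expression (1), comparing the explicit formula with a library implementation of the hyperbolic tangent, is given below as an illustration only.

```python
# Verify that the explicit formula (1) matches numpy's tanh and that the
# output stays in the open interval (-1, 1), as shown in Figure 1.
import numpy as np

def tanh_explicit(x):
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

x = np.linspace(-4.0, 4.0, 9)
assert np.allclose(tanh_explicit(x), np.tanh(x))
print(dict(zip(x, np.round(tanh_explicit(x), 4))))
```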

The total number of variables for the neural network is seven, of which one is the target variable referring to a satisfactory level of achievement. The training strategy over the data set is defined by an optimization algorithm using the quasi-Newton method. The optimization process is approximately the same as in the ordinary Newton method, with a modification of the Hessian matrix step.

The following is a brief numerical example of one type of quasi-Newton step that uses the exact inverse Hessian matrix in each iteration.

Target function:

\min f(x) = 2x_1^2 + 3x_2^2 \qquad (2)

Starting point selection:

\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \end{bmatrix} \qquad (3)

Gradient and inverse Hessian calculation:

\nabla f(x) = \begin{bmatrix} 4x_1 \\ 6x_2 \end{bmatrix} \qquad (4)

\nabla^2 f(x) = \begin{bmatrix} 4 & 0 \\ 0 & 6 \end{bmatrix} \qquad (5)

H^{-1} = \begin{bmatrix} 0.25 & 0 \\ 0 & 1/6 \end{bmatrix} \qquad (6)

Finding a new value for the variable x:

x_{k+1} = \begin{bmatrix} 1 \\ 1 \end{bmatrix} - \begin{bmatrix} 0.25 & 0 \\ 0 & 1/6 \end{bmatrix} \begin{bmatrix} 4 \\ 6 \end{bmatrix} \qquad (7)

The new value of the variable x:

x_{k+1} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \qquad (8)

The function converges when the gradient vanishes:

\nabla f(x) = 0 \qquad (9)
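The step above can be verified with a short sketch. It performs a plain Newton update with the exact inverse Hessian, which is what the example uses; practical quasi-Newton methods such as BFGS only approximate this matrix.

```python
# Reproduce the worked example: one Newton step on f(x) = 2*x1^2 + 3*x2^2
# starting from (1, 1) lands exactly on the minimizer (0, 0), where the
# gradient vanishes (the convergence condition (9)).
import numpy as np

def grad(x):                      # gradient of f, equation (4)
    return np.array([4.0 * x[0], 6.0 * x[1]])

hessian = np.array([[4.0, 0.0],   # Hessian of f, equation (5)
                    [0.0, 6.0]])

x0 = np.array([1.0, 1.0])                        # starting point (3)
x1 = x0 - np.linalg.inv(hessian) @ grad(x0)      # update (7)
print(x1, np.allclose(grad(x1), 0.0))            # -> [0. 0.] True
```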

The training over the data set has been performed within a thousand iterations during one hour of running time. After training, Figure 2 displays the training and selection errors in each iteration. The blue line represents the training error and the orange one the selection error. The initial value of the training error is 2.33562 and the final value after 135 epochs is 9.72136e-6. The initial value of the selection error is 3.444445 and the final value after 135 epochs is 0.382761.

Figure 2. Graphic representation of the error history using the quasi-Newton method

Figure 3 provides a graphic representation of the resulting artificial neural network architecture. It contains a scaling layer, the neural network itself, and an unscaling layer. The yellow circles represent the scaling neurons, the blue ones represent the perceptron neurons, and the red ones represent the unscaling neurons. The number of inputs is 6 and the number of outputs is 1. The number of hidden neurons is 1.

Figure 3. Graphic representation of the artificial neural network architecture
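The forward pass through this 6-1-1 architecture can be sketched as follows. The scaling bounds, weights, and biases below are hypothetical placeholders; in the paper they are produced by Neural Designer during training.

```python
# Forward pass of the described architecture: a scaling layer, one hidden
# perceptron with a tanh activation, and a linear output layer (standing in
# for the unscaling layer). All numeric parameters are made-up placeholders.
import numpy as np

X_MIN = np.zeros(6)          # per-variable minima used by the scaling layer
X_MAX = np.full(6, 5.0)      # per-variable maxima (hypothetical)
W_HIDDEN = np.full(6, 0.3)   # weights of the single hidden neuron (hypothetical)
B_HIDDEN = -0.5              # hidden bias (hypothetical)
W_OUT, B_OUT = 0.8, 0.4      # output weight and bias (hypothetical)

def predict(x):
    scaled = (np.asarray(x, dtype=float) - X_MIN) / (X_MAX - X_MIN)  # scaling layer
    hidden = np.tanh(W_HIDDEN @ scaled + B_HIDDEN)                   # 1 hidden neuron
    return W_OUT * hidden + B_OUT                                    # single output

# Six inputs: lectures, assignments, tests, grading, competitions, leisure.
print(predict([1, 1, 1, 0, 0, 0]))
```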

Testing has been performed on data in which the lecture variable is assigned the value 1. Table 3 displays the resulting value of the target variable, 0.0911, whose correlation with the original values has been confirmed.

Table 3

Figure 4 displays a graph illustrating the dependence of the target variable on the input variables.

Figure 4. Graph illustrating the dependence of the target variable on the input variables

In order to test the model further, we assigned random values to the input variables, setting the value 1 for three of them. The value of the target variable is 0.459, which is closer to 0 than to 1, which shows that the artificial neural network model provides the expected results. Table 4 displays the assigned input values as well as the resulting value of the target variable.

Table 4
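Interpreting such outputs reduces to comparing them with the midpoint of the target range; a small sketch using the two reported values is given below as an illustration.

```python
# Interpret the network output against the midpoint 0.5: values closer to 1
# indicate a satisfactory level of achievement, values closer to 0 do not.
# 0.459 and 0.0911 are the outputs reported for the two test cases above.
def satisfactory(output, threshold=0.5):
    return output >= threshold

for value in (0.459, 0.0911):
    print(value, "->", "satisfactory" if satisfactory(value) else "not satisfactory")
```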
In creating intelligent distance learning systems, based on the results obtained by processing the observed input variables in an artificial neural network, lectures, assignments, tests, grading, competitions, and directed free activities represent the necessary facilities for achieving a satisfactory level of accomplishment.

DISCUSSION

The defined input variables and the results of data processing with artificial neural networks are the basis for creating a distance learning system that provides users with unhindered access to all available resources on the distance learning portal [24].

Content organization on the distance learning platform should provide a clear, transparent, and logical organization of teaching content through lectures, which should be tailored in form and content to the target audience. After the lectures are delivered on the platform, tasks related to the completed lectures are assigned. Further follow-up of classes is enabled only after the tasks have been solved successfully.
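This progression rule, in which the next lecture becomes available only after the tasks of the previous one are solved, can be expressed with a small sketch; the data structures below are hypothetical and serve only to illustrate the described behaviour.

```python
# Sketch of the described gating rule: the next lecture becomes available
# only after the tasks attached to the previous one are solved successfully.
# Lecture and task identifiers are hypothetical placeholders.
def available_lectures(lectures, solved_tasks):
    """lectures: ordered list of (lecture_id, [task_ids]); solved_tasks: set."""
    for lecture_id, task_ids in lectures:
        yield lecture_id
        if not set(task_ids) <= solved_tasks:   # unsolved tasks block progress
            return

course = [("L1", ["T1", "T2"]), ("L2", ["T3"]), ("L3", ["T4"])]
print(list(available_lectures(course, solved_tasks={"T1", "T2"})))  # ['L1', 'L2']
```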
Tests represent a separate, periodically organized unit of the distance learning system. By their form, they can be classified into self-evaluative and formal tests. Assessment is a constant activity through which the lecturer monitors all the work as well as students' engagement.

Introducing contest-related content and directed leisure activities are innovations within distance learning systems that provide trainees with competency development and greater learning motivation.

The analysis included 102 elementary school pupils (eighth graders) who used the distance learning system as an adjunct to traditional teaching in the 2018-2019 school year, as well as 85 elementary school pupils taught in the traditional way in the 2017-2018 school year; the greater success of the pupils using the distance learning system is evident.

CONCLUSION

The research presented in this paper indicates that, by applying neural networks in creating intelligent distance learning systems, the achievements and competences of primary school pupils can be significantly improved. This paper presents a realized artificial neural network model in the function of developing and organizing teaching content and activities on a distance learning platform.

A comparative analysis of two generations of eighth graders in their first year of high school education indicates the efficiency of implementing the above-mentioned distance learning contents.

The developed distance learning model has been applied at the elementary school level. However, it can be successfully applied at the high school or academic level as well.

Further development of distance learning relates to the application, besides neural network methods, of other methods of in-depth data analysis.

The aforementioned research results should form the basis for further development of distance learning systems.
LITERATURE

[1] Sharda, R., 1994. Neural networks for the MS/OR analyst: An application bibliography. Interfaces 24 (2), 116-130.

[2] White, H., 1989. Learning in artificial neural networks: A statistical perspective. Neural Computation 1, 425-464.

[3] Hornik, K., Stinchcombe, M., White, H., 1989. Multilayer feedforward networks are universal approximators. Neural Networks 2, 359-366.

[4] McClure, P., & Kriegeskorte, N., 2016. Representational distance learning for deep neural networks. Frontiers in Computational Neuroscience, 10, 131.

[5] Saito, T., & Watanobe, Y., 2020. Learning path recommendation system for programming education based on neural networks. International Journal of Distance Education Technologies (IJDET), 18(1), 36-64.

[6] Syed, M. R., 2020. Methods and applications for advancing distance education technologies: International issues and solutions. Simulation, 348.

[7] Heidari, A. A., Faris, H., Mirjalili, S., Aljarah, I., & Mafarja, M., 2020. Ant lion optimizer: Theory, literature review, and application in multi-layer perceptron neural networks. In Nature-Inspired Optimizers (pp. 23-46). Springer, Cham.

[8] Milinković, S., Maksimović, M., 2013. Case study: Using decision tree classifier for analyzing students' activities. Journal of Information Technology and Applications, volume 3, number 2 (pp. 87-95), Apeiron, Banja Luka, Republic of Srpska, BiH.

[9] Ghorbanzadeh, O., Blaschke, T., Gholamnia, K., Meena, S. R., Tiede, D., & Aryal, J., 2019. Evaluation of different machine learning methods and deep-learning convolutional neural networks for landslide detection. Remote Sensing, 11(2), 196.

[10] Gil, C. R., Calvo, H., & Sossa, H., 2019. Learning an efficient gait cycle of a biped robot based on reinforcement learning and artificial neural networks. Applied Sciences, 9(3), 502.

[11] Nguyen, H., & Bui, X. N., 2019. Predicting blast-induced air overpressure: A robust artificial intelligence system based on artificial neural networks and random forest. Natural Resources Research, 28(3), 893-907.

[12] Liu, S., Oosterlee, C. W., & Bohte, S. M., 2019. Pricing options and computing implied volatilities using neural networks. Risks, 7(1), 16.

[13] Weinstein, B. G., Marconi, S., Bohlman, S., Zare, A., & White, E., 2019. Individual tree-crown detection in RGB imagery using semi-supervised deep learning neural networks. Remote Sensing, 11(11), 1309.

[14] Witten, I. H., Frank, E., 2005. Data Mining: Practical Machine Learning Tools and Techniques. Morgan Kaufmann, Burlington, MA, USA.

[15] Mair, C., Kadoda, G., Lefley, M., Phalp, K., Schofield, C., Shepperd, M., Webster, S., 2000. An investigation of machine learning based prediction systems. Journal of Systems and Software 53, (pp. 23-29).

[16] Kotsiantis, S. B., 2007. Supervised machine learning: A review of classification techniques. Informatica 31, (pp. 249-268).

[17] Carpenter, G. A., Grossberg, S., Reynolds, J. H., 1991. ARTMAP: Supervised real-time learning and classification of nonstationary data by a self-organizing neural network. Neural Networks 4, (pp. 565-588).

[18] Bishop, C. M., 2006. Pattern Recognition and Machine Learning. Springer, Berlin, Germany, Volume 4.

[19] Raudys, S. J., Jain, A. K., 1991. Small sample size effects in statistical pattern recognition: Recommendations for practitioners. IEEE Transactions on Pattern Analysis and Machine Intelligence 13, (pp. 252-264).

[20] Марковић, Љ., 2009. Примена backpropagation алгоритма за обучавање неуронских мрежа [Application of the backpropagation algorithm for training neural networks]. Симпозијум о операционим истраживањима, SYM-OP-IS 2009, Београд, (pp. 1-4), ISBN 978-86-80593-43-2.

[21] Pokrajac, D. D., 2013. Predicting students' results in higher education using neural networks. Conference: Applied Information and Communication Technologies, Jelgava, Latvia.

[22] Зекић-Сушац, М., Фрајман-Јакшић, А., Дрвенкар, Н., 2009. Неуронске мреже и стабла одлучивања за предвиђање успјешности студирања [Neural networks and decision trees for predicting study success]. Економски вјесник, XXII, (pp. 314-327).

[23] Isaković, I., 2013. CRM performances accented with the implementation of data warehousing and data mining technologies. Journal of Information Technology and Applications, volume 3, number 2 (pp. 107-112), Apeiron, Banja Luka, Republic of Srpska, BiH.

[24] Tomić, S., Drljača, D., 2019. DDLM - Quality Standard for Electronic Education Programs in Higher Education of Bosnia and Herzegovina. JITA - Journal of Information Technology and Applications, Pan-European University APEIRON, Banja Luka, Republika Srpska, Bosnia and Herzegovina, JITA 9(2019) 2:67-79, Volume 9, Number 2, December 2019, DOI: 10.7251/JIT1902067T, UDC 004.735.8:621.39(497.6), ISSN 2232-9625 (print), ISSN 2233-0194 (online).
Dragan Vasiljević was born on September 16, 1968, in Skopje, Republic of
North Macedonia. He graduated from the Military Academy in Rajlovac in
1991. He enrolled in a master's degree programme at the Faculty of Technical
Sciences in Čačak, which he completed in 2012. He is currently a Ph.D.
candidate at Pan-European University "Apeiron", Banja Luka, Bosnia and
Herzegovina. Address: Voždovac, V. Stepe 405D, Belgrade 11000, Republic of Serbia.

Branko Latinović was born on April 28, 1956, in Prijedor, Bosnia and
Herzegovina. He graduated from the Faculty of Economics in Banja Luka in
1980. He enrolled in a master's degree programme at the same faculty,
completing it in 1994, and in 1997 he successfully defended his doctoral
dissertation. He works as the dean of the Faculty of Information
Technologies of Pan-European University "Apeiron" in Banja Luka.
