ABSTRACT
Results of this study can equip educational and technology administrators with the information needed to make informed decisions on the upgrade, purchase, or development of new LMS solutions.
KEYWORDS
Learning Management System, LMS features, technical features, support structure, LMS, Open BRR, profiling
1. INTRODUCTION
Education has become a commodity in which people seek to invest for their own personal gain, to ensure equality of opportunity, and as a route to a better life [3]. The introduction of e-learning prompted the Philippine government to effect changes in the landscape of education in the Philippines, and it has created a number of initiatives in partnership with external agencies since the year 2000. One e-learning tool that is widespread today is the Learning Management System, sometimes referred to as a Course Management System (CMS) or Virtual Learning Environment (VLE). A Learning Management System (LMS) is a software application or Web-based technology used to plan, implement, and assess a specific learning process [4]. Vovides et al. [5] noted that LMSs are utilized in education in different ways and are evolving. An LMS can be used as a supplement to the traditional classroom
curriculum, i.e., as an electronic repository of course materials. Instructors who teach in-class
courses may also choose to use a ‘blended’ approach by utilizing the CMS as a tool to deliver
additional or supplemental course materials to students. Traditional teaching environments tend
to be teacher-centered while a blended approach allows instructors to mix things up and to offer
students a more intellectually engaging learning experience by combining in-class time with on-
line components through the use of synchronous and asynchronous tools. Finally, a CMS can
be used in distance education for the delivery of fully online courses.
The implementation of CMSs in universities followed the revolution in educational technology, which promised better quality, learner-centered education and stipulated that it would deliver more independent and active students [6]. There is evidence that the LMS is a preferred e-learning tool that is continuously gaining popularity and is being embraced as a major platform in the educational technology revolution.
The presence of e-learning methods and technologies alone is not a sufficient means of enhancing the learning process. According to Wagner et al. [7], successful implementation of e-learning depends on the extent to which the needs and concerns of the stakeholder groups involved are addressed. Problems in the implementation and utilization of an LMS exist. According to Evans [8], “There are many reasons for determining why teaching staff is not integrating technology into their classroom lessons”. The purpose of this study is to present a framework that will maximize the potential of the existing LMS by identifying current implementation issues and problems with regard to features, technical setup, support structure and utilization, through benchmarking against standards. Using gap analysis, a set of requirements can be formulated so that the real essence of the LMS's introduction is realized in an efficient manner.
2. METHODOLOGY
Evaluating software is a significant task for corporate IT managers, but probable users of
open source software lack an easy, effective, and trustworthy process for decision making.
There is no widely used model for assessment. This complicates open source adoption, as
companies assessing open source software can rarely learn from each other’s experiences [9].
Several methodologies exist in evaluating open source software. These often include a set of
criteria for the evaluation. Some examples of these evaluation methodologies are the Open
Source Maturity Model by CapGemini (OSMM CapGemini), Open Source Maturity Model
developed by Navicasoft’s Bernard Golden (OSMM Navica) and Qualification and Selection of
Open Source Software (QSOS) developed by Atos Origin. Another methodology for an Open
Standard in evaluating OSS is the Open Business Readiness Rating (Open BRR) developed by
Carnegie Mellon West, SpikeSource, O'Reilly, and Intel. Its goal is to enable the entire
community (enterprise adopters and developers) to rate software in an open and standardized
way. By adopting a framework for evaluating software, the current LMS features profile and the
standard LMS features profile can be created. This study implements the OpenBRR model as
an example to establish the features profile.
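To make the scoring concrete, the weighted roll-up used by OpenBRR-style models can be sketched as below. This is a minimal illustration, not the official OpenBRR procedure; the metric scores and category weights are invented for the example, and only the 50%/25%/25% sub-criterion split echoes the percentages that appear later in Table 3.

```python
# Sketch of an OpenBRR-style weighted scoring roll-up.  Metric scores on
# a 1-5 scale are weighted within each category, and category scores are
# weighted into an overall rating.  The metric scores and all weights
# below are illustrative assumptions, not official OpenBRR values.

def category_score(metrics):
    """metrics: list of (score_1_to_5, weight_fraction); weights sum to 1."""
    assert abs(sum(w for _, w in metrics) - 1.0) < 1e-9
    return sum(score * w for score, w in metrics)

def overall_rating(categories):
    """categories: dict name -> (score, weight_fraction); weights sum to 1."""
    assert abs(sum(w for _, w in categories.values()) - 1.0) < 1e-9
    return sum(score * w for score, w in categories.values())

# Sub-criterion weights echo the 50%/25%/25% split used in Table 3:
functionality = category_score([(3, 0.50), (5, 0.25), (5, 0.25)])  # = 4.0

rating = overall_rating({
    "functionality": (functionality, 0.60),  # assumed category weight
    "adoption": (4.4, 0.40),                 # assumed category weight
})
```

Whatever the exact weights adopted, the same two-level roll-up produces the category and overall ratings that the gap analysis later benchmarks against the standard.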
Profiling the current LMS includes a review of product documentation; the LMS software website and web searches for related documents can be used to complete the needed information. Profiling the standard LMS, in turn, includes identifying the properties and features that should be present in LMS software by reviewing documents on best practices in LMS implementation.
This section profiles the hardware and software components utilized by the current LMS. The hardware running the LMS, the server technologies utilized, and the operating system of the server form this profile. The standard hardware and software LMS profile, in turn, is created using the hardware and software recommendations from the LMS vendor or developer.
The rubric for assessing ICT infrastructure [10] was used to identify the network and internet connection profile and to identify acceptable standards for the assessment in this area. Mokhtar et al. [10] presented in their study an evaluation tool that evaluates the readiness of an institution for e-learning. Rubrics are sets of categories that define and describe important components of the areas being assessed. Each category contains a gradation of levels of implementation with a score assigned to each level [11]. The rubric adopts a 3-point scale to differentiate the levels of ICT implementation. The three categories of ICT implementation are descriptively labeled low, moderate and high to represent the lower, middle and upper tiers of the rubric scale. The part of this rubric that measures the network and internet was used to evaluate the network and internet infrastructure utilized by the current LMS. Defining the standard for each criterion entails a review of acceptable network and internet connection standards applied to HEIs. The rubric used to measure the network and internet infrastructure is shown in Table 1.
Table 1: Network and Internet Assessment based on the Rubric formulated by Mokhtar [10]

Network and Internet Indicators | Levels of implementation: Low / Moderate / High

Source: http://eprints.utm.my/3115/1/Rubric_For_Assessing_ICT_Infrastructure_in_Malaysia_Higher_Education_2006.pdf
2.1.3 Support Structure Profile
This section identifies the support structure for administration and support-related activities involving the end-users. The rubric for assessing the support structure shown in Table 2 was used to create the current LMS support structure profile and the standard LMS support structure profile. The support structure assessment rubric was derived from the user support best practices identified by Towns et al. [1] and from variables found in the ICT Maturity Tool, specifically the ICT Organizational Support Assessment identified by the Working Groups Experts [2]. The ICT Maturity Tool was developed by Carnegie Mellon and the International Development Research Center (IDRC) to assess the readiness of an institution in integrating ICT in HEIs. The assessment is based on a knowledge base of global trends in ICT applications in different universities and on studies of academic experiences and best practices. A component of this tool addresses ICT Organizational Support and is aimed at measuring key indicators of the user support structure. ICT Organizational Support determines the success or failure of ICT applications in higher education. Since the study was limited to the support structure concerns of the LMS, other components of the tool were not included. Towns et al. [1] identified a support model that enumerates key components that should be present in a support system. These components were used to profile the LMS, particularly with respect to the support model components. The support model components identify functional areas or variables that are critical in the implementation of a support structure. Best practices and experiences of LMS experts in the industry identified by the eLearning Guild were reviewed to formulate metrics for each component found in the rubric.
Table 2: Rubric for Assessing the Support Structure

Scoring: 5 – Excellent; 4 – Very Good; 3 – Acceptable; 2 – Poor; 1 – Unacceptable

User Information and Tools (availability of frequently asked questions, discussion forums and online documentation):
5 – Complete; available online; regularly updated
4 – Regularly updated, almost complete (above 75%) and available online
3 – Available and almost complete (above 50%) but not updated on a regular basis
2 – Available but incomplete (less than 50%) and not updated
1 – Not available

Service Level Agreements & Policies (appropriately set the shared expectations of the users of the LMS and of those providing support):
5 – Complete set of SLAs and policies
4 – Majority of the users adopt SLAs and policies (above 75%)
3 – Around half of the users adopt SLAs and policies
2 – Less than 50% adopt SLAs and policies
1 – None

User Accounts and Allocation Procedures (all users need to obtain some type of account and some form of authorization to use specific resources): levels range from "supported online and measures are in place in case problems occur", through "supported online with limited capabilities/capacity", down to "none".

Education and Training (the users need to be educated and trained in their use; new faculty and students are subjected to training; trained teachers train their students):
5 – Regular and scheduled training available to all users; trained teachers train their students
4 – Scheduled training but limited to the personnel who act as 1st or 2nd line of support in the department, who are then responsible for training their peers in the department; trained teachers train their students
3 – Scheduled training but limited to the critical personnel who act as 1st line of support in the department; teachers train their students
2 – On-demand and attended by selected or few users
1 – None

Help Desk Process (no support function would be complete without the core of staff providing day-to-day assistance to the users of the resources and services available):
5 – Available 24/7; full-time school-based or agency support with additional staff (including faculty); most technical support response time is less than 4 hours
4 – Limited to office hours, but online/electronic or automated alternatives are available; full-time school-based or agency support capable of troubleshooting basic network and hardware repair, including assistive technologies; technical support response time is less than 8 hours
3 – Limited to office hours; part-time school-based or agency support; most technical support response time is less than 24 hours
2 – Limited to availability of support staff; technical support response time greater than 24 hours
1 – None

Support Staff Information and Tools (the support staff must have at their disposal a number of “tools of the trade” and information resources to effectively provide support to the user community):
5 – Complete set of information which is digitally available, and support tools are complete and available
4 – Complete set of information which is digitally available, and tools (more than 80%) are available
3 – Most information is available (more than 75%) and tools (more than 60%) are available
2 – Incomplete set of tools and information
1 – None

Measuring Success through a Feedback System (a support group needs some way to determine success or failure of problem solving and support methods):
5 – An online feedback system is available; technology regularly used to review user assessment information, which results in needed changes
4 – An online feedback system is available; technology frequently used to review user assessment information
3 – An online feedback system is available; technology infrequently used to review student assessment information
2 – An online feedback system is available; technology not used to review student assessment information
1 – No feedback system is available

Support Staff

Support responsibilities (where is end-user support available? can be one or a combination of the following: centralized support, decentralized support (1st and 2nd level support), presence of Service Level Agreements (SLAs)):
5 – A support system that is systematically and efficiently structured and combines the strengths of the different end-user support; presence of 1st, 2nd and 3rd level of support; defines SLAs
4 – Presence of 1st and 2nd level of support
3 – Presence of 1st level of support
2 – Centrally available with limited capacity
1 – None exists

Staff in the following technical ICT areas (the staff caters to the following set of ICT responsibilities: network management, administrative system analysis and design, intranet and internet application development, database management, hardware maintenance and repair, help desk):
5 – Network management plus administrative system analysis and design plus hardware maintenance and repair plus database management plus intranet and internet application development and help desk
4 – Network management plus administrative system analysis and design plus hardware maintenance and repair and database management
3 – Network management plus administrative system analysis and design
2 – Network management only
1 – Very limited / none exists

Staff in the following ICT functional areas (systems administration; system maintenance and control; primary system of user support; instructional technology, to combine pedagogy with technology): levels range from a full complement — systems administration, maintenance and control, a system administrator for online courses / instructional technology, and 1st and/or 2nd level of user support (e.g. library and archives system, finance, student registration system, human resources, etc.) — down through systems administration only, to very limited / none exists.
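A rubric of this kind reduces each support-structure component to a score from 1 to 5 that can then be compared against a standard. The sketch below shows one way to record and summarize such scores; the component scores are illustrative placeholders, not results from the study.

```python
# Sketch: recording 1-5 rubric scores for support-structure components
# and summarizing the assessment.  Component names follow Table 2; the
# scores themselves are illustrative, not measured values.

SCALE = {5: "Excellent", 4: "Very Good", 3: "Acceptable",
         2: "Poor", 1: "Unacceptable"}

current_profile = {
    "User Information and Tools": 4,
    "Service Level Agreements & Policies": 3,
    "User Accounts and Allocation Procedures": 5,
    "Education and Training": 2,
    "Help Desk Process": 3,
}

def summarize(profile):
    """Label each component's score and flag components below Acceptable."""
    labels = {c: SCALE[s] for c, s in profile.items()}
    below = [c for c, s in profile.items() if s < 3]
    return labels, below

labels, below = summarize(current_profile)
# In this illustrative profile only "Education and Training" falls
# below the Acceptable level and would surface as a gap to bridge.
```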
This section profiles the utilization of the current LMS, identifying the utilization level of the LMS for both faculty and students. Server statistics from the LMS can be used to capture how the system was utilized in previous semesters and how it is utilized in the current semester. A survey on how the LMS is being utilized, and on the key factors affecting its use, will help profile the current utilization and identify key utilization issues. Information such as the distribution of faculty and student users, the tools used, and the times the LMS is utilized should be presented to better understand how the current LMS is utilized. Benchmarking against HEIs that have successfully implemented an LMS with high utilization levels, together with interviews with school administrators defining the ideal utilization level, will be inputs in coming up with the standard utilization profile of the LMS.
2.2 Using Gap Analysis to Identify Standard requirements for the LMS
Gap analysis consists of defining the present state, the desired or 'target' state, and hence the gap between them. In the later stages of problem solving, the aim is to look at ways to bridge the gap defined; this may often be accomplished by backward-chaining logical sequences of actions or intermediate states from the desired state to the present state [12]. Gap analysis was used for the features, technical and support structure assessments. Using these inputs, gaps can be identified, along with measures or actions on how each gap can be bridged. The assessment of the utilization profile of the current LMS is also presented in this section.
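Computationally, the gap-analysis step reduces to a simple per-criterion subtraction of the target state from the present state. The following sketch illustrates it with invented scores; the criterion names follow the assessment areas of this study.

```python
# Sketch of the gap-analysis step: gap = current state - target state,
# computed per criterion.  A negative gap marks a criterion where the
# current LMS falls short of the standard and an action item is needed.
# The criterion names and scores below are illustrative.

def gap_analysis(current, target):
    """Return {criterion: (gap, status)} for every criterion in target."""
    report = {}
    for criterion in target:
        gap = current[criterion] - target[criterion]
        if gap > 0:
            status = "exceeds standard"
        elif gap == 0:
            status = "meets standard"
        else:
            status = "below standard: bridge the gap"
        report[criterion] = (gap, status)
    return report

report = gap_analysis(
    current={"features": 4.4, "support structure": 2, "utilization": 3},
    target={"features": 4.1, "support structure": 3, "utilization": 3},
)
# features gap is about +0.3; support structure is -1 (below standard).
```

The criteria flagged "below standard" are exactly those for which requirements and bridging actions are then formulated.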
An evaluation of the LMS software was made in order to establish the competence of the current LMS. Gauging the competence of the LMS software is important to establish the merit of the software and its capability to meet the needs of its users and its intended purpose. A gap analysis for each criterion, as well as for the overall rating of the current LMS, was benchmarked against the identified standard rating for each category and the overall standard rating. This determined whether the current LMS software was fit and capable of meeting the competency level of a standard LMS. Table 3 presents an assessment example comparing the features profile of the Saint Louis University (SLU) myClasses LMS with the features profile of the standard LMS.
Table 3: Gap Analysis on the LMS Features

Categories/Metrics | Percentage | Current State (My Classes) | Target State (Standards) | Gap | Description
Functionality (25%) | – | 4.4 | 4.1 | 0.3 | The standard set for the functionality criterion is satisfied
Average volume of general mailing list in the last 6 months (50%) | 50 | 3 | 3 | 0 | My Classes LMS meets the standard that was set for this sub-criterion
Number of security vulnerabilities still open (unpatched) | 25 | 5 | 3 | 2 | My Classes LMS exceeds the standard that was set for this sub-criterion
Is there a dedicated information (web page or wiki) for security? (25%) | 25 | 5 | 5 | 0 | My Classes LMS meets the standard that was set for this sub-criterion
Adoption (8%) | – | 4.4 | 4.6 | -0.2 | The standard set for the adoption criterion was not satisfied
How many book titles does Amazon.com give for Power Search query: “subject: computer and title:component name”? | 20 | 2 | 3 | -1 | The number of books listed in Amazon does not meet the standard for this sub-criterion
This section presents the assessment of the hardware and software used by the My Classes LMS, benchmarked against the standards; gap analysis was used for this purpose. The assessment was done to identify whether each specified requirement meets the minimum requirement specified by the vendor or distributor of the LMS software based on the released hardware/software documentation. Table 4 presents an assessment example of the hardware and software utilized by the SLU myClasses LMS compared with the standards using gap analysis.
Table 4: Gap Analysis on the Hardware and Software Requirements

Requirement/Property | Target State (Recommended Specification) | Current State (My Classes LMS) | Gap | Description

Hardware
Processor | Dual Core AMD Opteron 2210; speed: 1.8GHz; Level 2 cache: 2x1MB; 1GHz | 1 Quad-Core Intel Xeon Processor X5450 (3.0GHz, 12MB L2, 1333MHz, 120W) | + | The processor of the My Classes LMS is better than the standard that was set
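The hardware assessment amounts to a field-by-field comparison of the deployed specification against the vendor's recommended minimum. A minimal sketch, with illustrative spec fields loosely based on the Table 4 row:

```python
# Sketch: checking the current server hardware against the vendor's
# recommended minimum specification (the Table 4 comparison).  The spec
# fields and numbers are illustrative assumptions, not exact values.

MINIMUM = {"cpu_cores": 2, "cpu_ghz": 1.8, "l2_cache_mb": 2}   # vendor minimum
CURRENT = {"cpu_cores": 4, "cpu_ghz": 3.0, "l2_cache_mb": 12}  # deployed server

# Collect every field where the deployed server falls below the minimum.
shortfalls = {field: (CURRENT[field], MINIMUM[field])
              for field in MINIMUM if CURRENT[field] < MINIMUM[field]}

meets_minimum = not shortfalls  # True when no field falls short
```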
A gap analysis was performed to identify the gaps between the current network and internet infrastructure utilized by the LMS and the standards that were defined. Table 5 shows an example of SLU's myClasses LMS gap analysis on the network and internet infrastructure.
Table 5: Gap Analysis on the Network and Internet Requirements

Gap analysis on the support structure (Component | Current State | Target State | Gap | Description):

User Accounts and Allocation Procedures | 5 | 5 | 0 | Current setup meets the required standard for user accounts and allocation procedures
Support Staff Information and Tools | 4 | 5 | -1 | The following were not present on the My Classes LMS: complete set of support staff information and tools

Support Staff
Support responsibilities | 2 | 3 | -1 | The following were not present on the My Classes LMS: presence of 1st level of support
Staff in the following technical ICT areas | 4 | 3 | +1 | Current setup meets the required standard for staff in the technical ICT areas
Staff in the following ICT functional areas | 3 | 5 | -2 | The following were not present on the My Classes LMS: 1st and/or 2nd level of user support; system administrator for online courses / instructional technology (to combine pedagogy with technology) acting in full-time capacity
2.2.4 Utilization Assessment
The assessment of the My Classes utilization was performed through a quantitative and
qualitative analysis of the information collected. Factors affecting the stakeholders’ utilization of
the LMS were identified after a methodical analysis of data gathered. A qualitative analysis of
the comments given by the respondents further helped in identifying these factors. A review of
the current utilization profile and the optimal utilization goal, defined through the review of successful LMS implementations and goals set by school administrators, can be used to identify the gaps between the current utilization of the LMS and the target (standard).
The information gathered after the analysis was used to create recommendations on possible
actions to improve the utilization of the My Classes LMS.
Summary of identified gaps and recommendations:

Features: none

Technical: none

Support Structure:

User Information and Tools
- Complete set of user information and tools:
  - The complete set of user information and tools provided by Dokeos should be available on the SLU Network.
  - A Frequently Asked Questions page should be created, reflecting basic answers to common questions asked by users.
- End-user information and tools that are locally customized and available online:
  - Information and tools should be customized according to the needs of the My Classes LMS users; localized online manuals for students and teachers should be created.
- End-user information and tools that are regularly updated:
  - Review and update end-user information and tools, especially when changes or upgrades are made to the software or when additional documentation is requested by end-users.

Education and Training
- Regular and scheduled training for student and faculty users:
  - For new faculty, scheduled training on the LMS should be conducted by the LMS staff through the personnel office. The training should introduce the features of the LMS, the creation and administration of online courses, and adding content to courses, and should emphasize the teachers' role as prime movers of their students' adoption of the LMS and their active role in training their students in its use.
  - For freshmen students, a brief introduction to the LMS can be given during their scheduled orientation. It is also suggested that an LMS primer be included in the student handbook or primer distributed during enrolment.
  - Scheduled training for all departments of all colleges should be conducted by the LMS staff to ensure that every faculty member receives LMS training.
  - During system upgrades or updates, all support staff should be re-trained and informed of significant changes.

Help Desk Process
- Technical response time of less than 24 hours:
  - Create a dedicated support group to handle this function. The LMS support group should be able to handle all problems within the day, with a response time of less than 24 hours.
- Help desk process available for the whole office-hours period:
  - Make the help desk process available during office hours by delegating the responsibility to the support staff.

Support Staff Information and Tools
- Complete set of support staff information and tools:
  - Support staff information should be well documented and available online, and updated frequently to reflect important documentation about troubleshooting known end-user as well as technical LMS problems.
  - Support tools such as browsers or browser plug-ins should be complete and up to date; other technical and networking tools should be available through a common online repository such as an FTP server.

Measuring Success through a Feedback System
- A reliable online feedback system is available:
  - A feedback mechanism based on regular surveys conducted at least once a year should replace the existing feedback mechanism. A survey measuring user insights about the LMS should be conducted in order to obtain useful user feedback information.
  - A structured feedback mechanism through an online form should be made available to replace feedback through e-mail, which is accomplished only voluntarily.
- Technology regularly used to review user assessment information, resulting in needed changes:
  - Feedback information should be assessed and recommendations created for the improvement of the LMS. This information should be benchmarked with similar studies to provide the best recommendations for further improving the LMS.

Support responsibilities
- Presence of 1st level of support:
  - A structure of support for end-users should be created. The responsibilities of the support groups shall delineate key responsibilities for services expected by the end-users. This should be embodied in the SLAs that will be formulated for students and teachers.
  - The 1st level of support for students shall be their instructor conducting the online course. In case problems are unresolved, the faculty member may seek advice or help from his department head or from the LMS support group.
  - The 1st level of support for teachers/faculty shall be the department head. Faculty may seek advice from the LMS staff if issues are unresolved.
- 1st and/or 2nd level of user support:
  - A structure of support for end-users should be created to provide support services to the end-users.

Staff in the following ICT functional areas
- System administrator for online courses / instructional technology (to combine pedagogy with technology) acting in full-time capacity:
  - A dedicated LMS system administrator should be appointed on a full-time basis, or the setup of functions should be reviewed to allow the current LMS administrator to achieve the same objective.

Utilization:

- Training: implement the education and training requirements identified for the support structure.
- Lack of information to promote the LMS: implement the requirements identified under the support structure.
- Staffing: improve the staffing responsibilities detailed under the support structure.
- LMS reliability: further improve technical facilities to produce a more reliable system, encompassing all hardware equipment handled by the ICTR Lab and SLUNet Office and IT equipment used by end-users in the university.
- IT equipment problems: address these issues by looking into the recommendations of the following studies: Mercado (2008), Parilla-Ferrer (2007), Tagimacruz (2005).
3. CONCLUSION
4. REFERENCES
[1] Towns, J., Ferguson, J., Fredrick, D. and Myers, G., 2001, ‘Grid User Support Best Practices’, viewed 2009 October, <http://www.ggf1.nl/abstracts/GUS/GridUserServicesBestPractices-02221.pdf>
[2] Working Groups Experts 2000, ‘Guidelines for Institutional Self-Assessment of ICT Maturity in African Universities’, viewed 2009 September, <http://www.aau.org/english/documents/ICT-GUID.pdf>
[3] Davies, D. 1998, ‘The Virtual University: A Learning University’, The Journal of Workplace
Learning, vol. 10, no. 4, pp. 175 – 213.
[4] Paulsen, M., 2002, ‘Online Education Systems in Scandinavian and Australian Universities: A Comparative Study’, The International Review of Research in Open and Distance Learning, vol. 3, no. 2, viewed 2009 July, <http://www.irrodl.org/index.php/irrodl/article/view/104/559>
[5] Vovides, Y., Sanchez-Alonso, S., Mitropoulou, V. and Nickmans, G., 2007, ‘The Use of E-learning Course Management Systems to Support Learning Strategies and to Improve Self-regulated Learning’, Educational Research Review, vol. 2, pp. 64–74.
[6] Swinney, L. A. , 2004, ‘Why Faculty Use a Course Management System(blackboard) to
Supplement their Teaching of Traditional Undergraduate Courses’, Doctor of
Philosophy Dissertation, University of North Dakota
[7] Wagner, N., Hassanein, K. and Head, M., 2008, ‘Who is responsible for E-Learning Success
in Higher Education: A Stakeholders’ Analysis’, Journal of Educational Technology and
Society, vol. 11, no. 3, pp. 26 - 36.
[8] Evans, K., 2005, ‘Front End Analysis Plan for the Underutilization of Technology at Lincoln Middle School’, viewed 2009 September, <http://www.kristenevans.info/su/reports/FEAplan.pdf>
[9] SpikeSource, the Center for Open Source Investigation at Carnegie Mellon West, and Intel Corporation, 2005, ‘Business Readiness Rating for Open Source: A Proposed Open Standard to Facilitate Assessment and Adoption of Open Source Software, RFC 1’, viewed 2009 July, <http://www.openbrr.org/docs/BRR_whitepaper_2005RFC1.pdf>
[10] Mokhtar, S., Alias, R. and Rahman, A., 2007, ‘Rubric for Assessing ICT Infrastructure in Malaysia Higher Education’, viewed 2009 September, <http://eprints.utm.my/3115/1/Rubric_For_Assessing_ICT_Infrastructure_in_Malaysia_Higher_Education_2006.pdf>
[11] Pickett, N., 1998, Creating Rubrics [online], viewed 9 February 2006,
<http://teacher.esuhsd.org/rubrics/ >
[12] UC 2008, ‘University of Canterbury LMS Review – Final Report and Recommendations’, viewed 2009 August, <http://uctl.canterbury.ac.nz/files/staff/moodle/Final%20Report%20of%20the%20LMS%20Review%20Steering%20Group%20-%20public%20version.pdf>
[13] Deprez, J.C. and Alexandre, S., 2008, ‘Comparing Assessment Methodologies for Free/Open Source Software: OpenBRR & QSOS’, viewed 2009 October, <http://www.qualoss.org/dissemination/DEPREZ_CompareFlOSSAssessMethodo-Camera-02.pdf>
[14] Letellier F., 2009, ‘FOSS-Bridge’, viewed 2009 October,<http://netnam.vn/foss-
bridge/uploads/Main/Foss-Bridge-Block27.pdf >
[15] De Silva, C., 2009, ‘Open source software assessment methodologies’, viewed 2009
October, <http://www.itpro.lk/?q=node/2814>
[16] Centre for Learning & Performance Technologies, 2009, ‘Learning Tools Compendium’, viewed 2009 December, <http://www.c4lpt.co.uk/Directory/Tools/instructional.html>
[17] Kvavik, R., Caruso, J. and Morgan, G., 2004, ‘ECAR Study of Students and Information Technology, 2005: Convenience, Connection, Control and Learning’, EDUCAUSE Center for Applied Research, vol. 5.
[18] Edutools 2005, ‘EduTools Website’, viewed 2009 December, <http://www.edutools.info/glossary.jsp?pj=4>
[19] Smith, S., Salaway, G., Caruso, J. and Katz, R., 2009, ‘ECAR Study of Undergraduate Students and Information Technology, 2009’, EDUCAUSE Center for Applied Research, vol. 6.
[20] McHenry, B., 2010, ‘New Features for Learning Management Systems’, viewed 2010 January, <http://www.sloanconsortium.org/publications/magazine/v3n2/mchenry.asp>