A. SYSTEM USABILITY
The researcher adopted the system usability questionnaire from Lewis, J. R. (1995),
"IBM Computer Usability Satisfaction Questionnaires: Psychometric Evaluation and
Instructions for Use." See Appendix A for a copy of the questionnaire. The researcher
conducted the system usability test in the six (6) BISU campuses, namely: Bilar
Campus, Main Campus, Candijay Campus, Clarin Campus, Calape Campus, and
Balilihan Campus, from February to March 2013. The respondents were the librarians
and library personnel of each campus. The researcher demonstrated the system's
features in detail, such as the borrower's information profile, library item information,
and the check-out, check-in, and reservation modules. The researcher also presented
the library statistical reporting and provided the respondents with the system processes
manual (see Appendix C). Each presentation took 2-3 hours. The respondents then
performed the library system processes hands-on and afterward answered the system
usability questionnaire. The researcher collected responses from 16 respondents across
the BISU campus libraries.

Table 7. Summary of Respondents in the System Usability


Campus      No. of Respondents    Percentage
Bilar       4                     25.00
Main        3                     18.75
Candijay    2                     12.50
Clarin      3                     18.75
Calape      2                     12.50
Balilihan   2                     12.50
Total       16                    100.00

The ranges of the interpretative guide were computed by getting the interval
value. The interval value for the system usability scale is 0.9, computed as follows:

Interval = (Number of options - 1) / Number of options
         = (7 - 1) / 7
         = 0.857, rounded to 0.9
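The interval computation above can be sketched in code. Rounding to one decimal place is an assumption inferred from the stated value of 0.9; the function name is illustrative.

```python
# Width of each interpretation band for an n-point Likert scale,
# following the interval formula used in the text.
def scale_interval(options: int) -> float:
    return (options - 1) / options

interval = scale_interval(7)   # 6/7 = 0.857...
print(round(interval, 1))      # → 0.9, matching the text
```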

The interpretative guide for the interpretation of the statistical results of the
system usability is presented in Table 8 below.

Table 8. Interpretative Guide of the System Usability


Weight   Range       Description                  Interpretation
7        6.4 - 7.0   Strongly Agree               The respondents strongly believe and are confident that the system is very usable.
6        5.5 - 6.3   Agree                        The respondents believe and are confident that the system is usable.
5        4.6 - 5.4   Tend to Agree                The respondents tend to believe that the system is usable.
4        3.7 - 4.5   Neither Agree nor Disagree   The respondents are neutral in trusting that the system is usable.
3        2.8 - 3.6   Tend to Disagree             The respondents tend not to trust that the system is usable.
2        1.9 - 2.7   Disagree                     The respondents believe that the system is not usable.
1        1.0 - 1.8   Strongly Disagree            The respondents are strongly confident that the system is not usable.
Table 9 below presents the system usability results. The tabulated results were
computed as weighted means, and each weighted mean was interpreted according to
the interpretative guide in Table 8. Table 9 shows the weighted mean and interpretation
for each statement. The average weighted mean of the usability questionnaire is 6.4,
interpreted as "Strongly Agree." This implies that the system's usability is very high and
that the system provides high satisfaction among the respondents. It likewise reveals
that the system is simple and easy to use, effective, efficient, informative, easy to
understand, and clear. Further, it implies that the majority of the respondents strongly
agree with the capabilities, functions, and ease of use of the proposed system.

Table 9. System Usability Result


Criteria for System Usability                                                    Weighted Mean   Interpretation
1. Overall, I am satisfied with how easy it is to use this system.               6.6             Strongly Agree
2. It was simple to use this system.                                             6.5             Strongly Agree
3. I can effectively complete my work using this system.                         6.4             Strongly Agree
4. I am able to complete my work quickly using this system.                      6.4             Strongly Agree
5. I am able to efficiently complete my work using this system.                  6.4             Strongly Agree
6. I feel comfortable using this system.                                         6.4             Strongly Agree
7. It was easy to learn to use this system.                                      6.7             Strongly Agree
8. I believe I became productive quickly using this system.                      6.6             Strongly Agree
9. The system gives error messages that clearly tell me how to fix problems.     6.1             Agree
10. Whenever I make a mistake using the system, I recover easily and quickly.    6.0             Agree
11. The information (such as online help, on-screen messages, and other
    documentation) provided with this system is clear.                           6.4             Strongly Agree
12. It is easy to find the information I needed.                                 6.6             Strongly Agree
13. The information provided for the system is easy to understand.               6.5             Strongly Agree
14. The information is effective in helping me complete the tasks and scenarios. 6.3             Agree
15. The organization of information on the system screens is clear.              6.5             Strongly Agree
16. The interface of this system is pleasant.                                    6.4             Strongly Agree
17. I like using the interface of this system.                                   6.4             Strongly Agree
18. This system has all the functions and capabilities I expect it to have.      6.2             Agree
19. Overall, I am satisfied with this system.                                    6.6             Strongly Agree
AVERAGE WEIGHTED MEAN                                                            6.4             Strongly Agree
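The lookup from a weighted mean to its Table 8 interpretation can be sketched as follows. The band edges are copied from Table 8; the data structure and function names are illustrative assumptions, not part of the study's instrument.

```python
# Interpretation bands for the 7-point system usability scale (Table 8).
BANDS_7PT = [
    (6.4, 7.0, "Strongly Agree"),
    (5.5, 6.3, "Agree"),
    (4.6, 5.4, "Tend to Agree"),
    (3.7, 4.5, "Neither Agree nor Disagree"),
    (2.8, 3.6, "Tend to Disagree"),
    (1.9, 2.7, "Disagree"),
    (1.0, 1.8, "Strongly Disagree"),
]

def interpret(mean: float) -> str:
    """Map a weighted mean (rounded to one decimal) to its Table 8 label."""
    m = round(mean, 1)
    for lo, hi, label in BANDS_7PT:
        if lo <= m <= hi:
            return label
    raise ValueError("mean outside the 1.0-7.0 scale")

print(interpret(6.4))  # → Strongly Agree (the study's average weighted mean)
print(interpret(6.1))  # → Agree (e.g. item 9)
```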

B. WEB USABILITY
The aspects of Web usability were evaluated using a Web Usability Survey
adopted from the Massachusetts Institute of Technology. See Appendix B for a copy of
the questionnaire. The Web Usability Survey consists of questions that rate the
website's aspects in Navigation, Functionality, User Control, Language and Content,
Online Help and User Guides, System and User Feedback, Consistency, Error
Prevention and Correction, and Architectural and Visual Clarity.
During the administration of the study, the sample size was determined from a
total population based on the approximate student population of all BISU campuses,
which is 11,000. The sample size was rounded off to 386 respondents. The sample
size was computed using Slovin's formula as follows:

n = N / (1 + Ne²)
  = 11,000 / (1 + (11,000)(0.05)²)
  = 386

where n = the sample size
      N = the total population
      e = the margin of error
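The Slovin computation above can be verified numerically. Rounding up to the next whole respondent is an assumption; the raw quotient is 385.96.

```python
import math

# Slovin's formula, n = N / (1 + N * e^2), as used in the text.
def slovin(N: int, e: float) -> int:
    return math.ceil(N / (1 + N * e ** 2))

print(slovin(11_000, 0.05))  # → 386, matching the stated sample size
```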
Table 10 below presents the distribution of the respondents. Of the 386, 136
are BISU students, 56 are BISU faculty and staff, 165 are non-BISU students, and 29
are non-BISU professionals.

During the actual survey administration, a random sampling procedure was utilized.
The administration involved two different groups. The first group comprised BISU
students, faculty, and staff. The researcher conducted the web usability test in the six
(6) BISU campuses from February to March 2013, with respondents accessing existing
BISU library records through the Internet. The respondents were randomly selected
during computer laboratory classes because the demonstration had to be conducted
inside the computer laboratory. The researcher sought approval from the campus
administrator, dean, or the instructor in charge of the laboratory room before the
demonstration and testing process. The campus librarian also assisted during the
conduct of the web usability evaluation. During the demonstration, the Online Public
Access Catalog (OPAC) features were presented in detail through the web, and the
respondents then worked through the OPAC processes. After using the OPAC, the
respondents were asked to answer the web usability survey questionnaire. A total of
192 BISU respondents participated in the web usability evaluation: 136 students and
56 faculty members and staff.
The second round of web usability testing was conducted by the researcher at
Silliman University, Dumaguete City, to capture the evaluation from a community
perspective. This group browsed and accessed existing BISU library records through
the Internet on April 17-19, 2013. The respondents were students and other
professionals, randomly selected during computer laboratory classes because the
demonstration had to be conducted inside the computer laboratory. The researcher
sought approval from the College Dean before the demonstration and testing process.
During the demonstration, the OPAC features were presented in detail through the
web, and the respondents then worked through the OPAC processes. After using the
OPAC, the respondents were asked to answer the web usability survey questionnaire.
A total of 194 respondents in this group participated: 165 students from Silliman
University and 29 other professionals.

Table 10. Summary of Respondents in the Web Usability

Type of Respondents       No. of Respondents    Percentage
BISU Students             136                   35.23
BISU Faculty/Staff        56                    14.51
Non-BISU Students         165                   42.75
Non-BISU Professionals    29                    7.51
Total                     386                   100.00

Similar to the system usability evaluation, the ranges of the interpretative guide
for web usability were computed by getting the interval value. The interpretative guide
for the interpretation of the statistical results of the web usability is presented in Table
11 below.
Table 11. Interpretative Guide of the Web Usability

Weight   Range       Description   Interpretation
5        4.3 - 5.0   Excellent     The respondents strongly believe and are confident that the website is excellent in all aspects of design, development, and implementation.
4        3.5 - 4.2   Very Good     The respondents believe and are confident that the system is very usable. They are also confident that any minor inconsistencies and aesthetic issues in the website are manageable and will not affect the performance of the proposed system.
3        2.7 - 3.4   Good          The respondents believe and are confident that the system is usable. They are also confident that any problems in the website are non-critical and will not cause major confusion or irritation.
2        1.9 - 2.6   Fair          The respondents are neutral in trusting that the website is usable. They also believe that a serious problem occurred in the website that needs high priority to fix and can cause a user to make a significant error.
1        1.0 - 1.8   Poor          The respondents believe that the website is not usable and that the website has severe problems.
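The same interval computation applies to the 5-point web usability scale. The text does not state this interval explicitly; (5 - 1) / 5 = 0.8 is an assumption consistent with the Table 11 ranges, and the lookup structure below is illustrative.

```python
# Interpretation bands for the 5-point web usability scale (Table 11).
interval_5pt = (5 - 1) / 5          # 0.8, by the interval formula in the text
BANDS_5PT = [
    (4.3, 5.0, "Excellent"),
    (3.5, 4.2, "Very Good"),
    (2.7, 3.4, "Good"),
    (1.9, 2.6, "Fair"),
    (1.0, 1.8, "Poor"),
]

def rating(mean: float) -> str:
    """Map a weighted mean (rounded to one decimal) to its Table 11 label."""
    m = round(mean, 1)
    for lo, hi, label in BANDS_5PT:
        if lo <= m <= hi:
            return label
    raise ValueError("mean outside the 1.0-5.0 scale")

print(rating(4.2))  # → Very Good (the aggregate mean in Table 12)
print(rating(4.3))  # → Excellent (e.g. Navigation)
```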

Table 12. Web Usability Results

Legend: BS = BISU Students, BFS = BISU Faculty/Staff, NBS = Non-BISU Students,
NBP = Non-BISU Professionals, WM = Weighted Mean

WEB USABILITY CRITERIA                                                             BS    BFS   NBS   NBP   WM    Description
I. Navigation                                                                      4.2   4.3   4.1   4.7   4.3   Excellent
1.1 Current location within the site is shown clearly.                             4.3   4.3   4.1   4.7   4.4   Excellent
1.2 Link to the site's main page is clearly identified.                            4.1   4.3   4.0   4.8   4.3   Excellent
1.3 Major/important parts of the site are directly accessible from the main page.  4.2   4.2   4.1   4.6   4.3   Excellent
1.4 Easy-to-use Search function is provided, as needed.                            4.4   4.5   4.3   4.8   4.5   Excellent
1.5 Site accommodates novice to expert users.                                      4.2   4.3   4.1   4.7   4.3   Excellent
II. Functionality                                                                  4.0   4.2   3.9   4.5   4.2   Very Good
2.1 Functions are clearly labeled.                                                 4.2   4.5   4.0   4.8   4.4   Excellent
2.2 Essential functions are available without leaving the site.                    3.9   4.2   3.9   4.6   4.1   Very Good
2.3 Plug-ins are used only if they add value.                                      3.9   3.9   3.8   4.3   4.0   Very Good
III. User Control                                                                  4.1   4.2   4.0   4.4   4.2   Very Good
3.1 Site reflects user's workflow.                                                 4.1   4.3   3.9   4.6   4.2   Very Good
3.2 User can cancel any operation.                                                 4.1   4.1   4.0   4.3   4.1   Very Good
3.3 Clear exit point is provided on every page.                                    4.0   3.9   3.8   4.3   4.0   Very Good
3.4 Pages load moderately to accommodate slow connections.                         3.9   4.1   4.0   4.3   4.1   Very Good
3.5 Currently used browser is supported.                                           4.3   4.4   4.4   4.4   4.4   Excellent
IV. Language and Content                                                           4.1   4.3   4.1   4.6   4.3   Excellent
4.1 Important information and tasks are given prominence.                          4.2   4.4   4.0   4.7   4.3   Excellent
4.2 Information of low relevance or rarely used information is not included.       3.8   3.9   3.9   4.3   4.0   Very Good
4.3 Related information or tasks are grouped: on the same page or menu,
    or in the same area within a page.                                             4.2   4.4   4.3   4.7   4.4   Excellent
4.4 Language is simple, without jargon.                                            4.3   4.5   4.3   4.7   4.4   Excellent
4.5 Paragraphs are brief.                                                          4.2   4.4   4.3   4.7   4.4   Excellent
4.6 Links are concise, expressive, and visible, not buried in text.                4.1   4.4   4.0   4.7   4.3   Excellent
4.7 Terms are defined.                                                             4.0   4.3   3.9   4.6   4.2   Very Good
V. Online Help and User Guides                                                     4.0   3.9   3.7   4.3   4.0   Very Good
5.1 It is always clear what is happening on the site (visual hints, etc.).         4.0   3.9   3.6   4.3   4.0   Very Good
5.2 Users can receive email feedback if necessary.                                 3.7   3.8   3.6   4.3   3.8   Very Good
5.3 Confirmation screen is provided for form submittal.                            3.9   3.9   3.7   4.2   3.9   Very Good
5.4 All system feedback is timely.                                                 3.9   3.7   3.7   4.4   4.0   Very Good
5.5 Users are informed if a plug-in or browser version is required.                4.1   4.0   3.7   4.5   4.1   Very Good
5.6 Each page includes a "last updated" date.                                      4.1   4.0   4.0   4.2   4.1   Very Good
VI. Consistency                                                                    4.2   4.3   4.1   4.6   4.3   Excellent
6.1 The same word or phrase is used consistently to describe an item.              4.2   4.3   4.1   4.6   4.3   Excellent
6.2 Link reflects the title of the page to which it refers.                        4.2   4.3   4.1   4.5   4.3   Excellent
6.3 Browser page title is meaningful and reflects the main page heading.           4.1   4.2   4.0   4.6   4.2   Very Good
VII. Error Prevention and Correction                                               4.0   4.2   4.0   4.5   4.2   Very Good
7.1 Users can rely on recognition, not memory, for successful use of the site.     4.1   4.2   4.1   4.6   4.2   Very Good
7.2 Site tolerates a reasonable variety of user actions.                           4.0   4.1   3.8   4.6   4.1   Very Good
7.3 Site provides concise instructions for user actions, including entry format.   4.1   4.2   3.9   4.6   4.2   Very Good
7.4 Error messages are visible, not hidden.                                        4.1   4.2   3.9   4.4   4.2   Very Good
7.5 Error messages are in plain language.                                          4.1   4.2   4.1   4.3   4.2   Very Good
7.6 Error messages describe actions to remedy a problem.                           3.9   4.1   4.0   4.5   4.1   Very Good
7.7 Error messages provide a clear exit point.                                     3.9   4.2   4.0   4.6   4.2   Very Good
VIII. Architectural and Visual Clarity                                             4.2   4.2   4.0   4.6   4.2   Very Good
8.1 Site is organized from the user's perspective.                                 4.2   4.1   3.9   4.7   4.2   Very Good
8.2 Site is easily scannable for organization and meaning.                         4.2   4.2   4.0   4.6   4.2   Very Good
8.3 Site design and layout are redundant only when required for user productivity. 4.0   4.2   4.0   4.6   4.2   Very Good
8.4 White space is sufficient; pages are not too dense.                            4.0   4.0   4.0   4.4   4.1   Very Good
8.5 Unnecessary animation is avoided.                                              4.3   4.5   4.3   4.6   4.4   Excellent
8.6 Colors used for visited and unvisited links are easily seen and understood.    4.2   4.1   4.0   4.6   4.2   Very Good
8.7 Bold and italic text is used sparingly.                                        4.2   4.4   4.1   4.6   4.3   Excellent
AGGREGATE MEAN                                                                                             4.2   Very Good
Based on the results shown in Table 12, the site demonstrated strong Web
usability: an Excellent rating in Navigation (x̄ = 4.3), Very Good in Functionality
(x̄ = 4.2), Very Good in User Control (x̄ = 4.2), Excellent in Language and Content
(x̄ = 4.3), Very Good in Online Help and User Guides (x̄ = 4.0), Excellent in
Consistency (x̄ = 4.3), Very Good in Error Prevention and Correction (x̄ = 4.2), and
Very Good in Architectural and Visual Clarity (x̄ = 4.2). Table 12 shows the aggregate
mean and interpretation for each component. The aggregate mean of the web usability
is 4.2, interpreted as "Very Good." This implies that the OPAC's usability is very good.
The result also suggests that the respondents are highly satisfied with the navigation,
consistency, and ease of use of the web page.

Table 13. Summary of Web Usability Evaluation

Legend: each cell shows the weighted mean (x̄) and its description.

Components of Web Usability               BISU Students    BISU Faculty/Staff   Non-BISU Students   Non-BISU Professionals   Total
I. Navigation                             4.2 Very Good    4.3 Excellent        4.1 Very Good       4.7 Excellent            4.3 Excellent
II. Functionality                         4.0 Very Good    4.2 Very Good        3.9 Very Good       4.5 Excellent            4.2 Very Good
III. User Control                         4.1 Very Good    4.2 Very Good        4.0 Very Good       4.4 Very Good            4.2 Very Good
IV. Language and Content                  4.1 Very Good    4.3 Excellent        4.1 Very Good       4.6 Excellent            4.3 Excellent
V. Online Help and User Guides            4.0 Very Good    3.9 Very Good        3.7 Very Good       4.3 Excellent            4.0 Very Good
VI. Consistency                           4.2 Very Good    4.3 Excellent        4.1 Very Good       4.6 Excellent            4.3 Excellent
VII. Error Prevention and Correction      4.0 Very Good    4.2 Very Good        4.0 Very Good       4.5 Excellent            4.2 Very Good
VIII. Architectural and Visual Clarity    4.2 Very Good    4.2 Very Good        4.0 Very Good       4.6 Excellent            4.2 Very Good
AGGREGATE MEAN                                                                                                               4.2 Very Good

C. WEB ACCESSIBILITY
The Web accessibility of the Online Public Access Catalog (OPAC) was
evaluated with WAVE (http://wave.webaim.org/), a free web accessibility evaluation
tool powered by WebAIM (Web Accessibility in Mind). The tool aids humans in the
Web accessibility evaluation process: rather than producing a complex technical
report, WAVE displays the original Web page with embedded icons and indicators
that reveal the accessibility of that page. The tool works by scanning the OPAC
website at http://libtest.bisu.edu.ph, examining the syntax and structure of the pages
and determining whether the code follows Web accessibility guidelines. After scanning
the site, the results showed that the web page had no web accessibility problems.
Figure 23 below shows the actual result of the Web accessibility scan.

Figure 23. Web Accessibility Scan Result
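To illustrate the kind of rule that tools like WAVE automate, the sketch below checks one well-known accessibility guideline: every image should carry alternative text. This is not the WAVE tool or its output; the sample HTML and class name are hypothetical.

```python
# Minimal illustration of an automated accessibility check: count <img>
# elements that lack an alt attribute (one common WCAG requirement).
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag.
        if tag == "img" and "alt" not in dict(attrs):
            self.missing += 1

sample = ('<html><body>'
          '<img src="logo.png" alt="BISU logo">'
          '<img src="banner.png">'
          '</body></html>')
checker = MissingAltChecker()
checker.feed(sample)
print(checker.missing)  # → 1 (one image lacks alt text)
```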

CONCLUSIONS

After a thorough design and development process, the proposed centralized library
system of BISU was successfully completed. The proposed system, BISU-Lib, is very
good in terms of usability and accessibility. The librarians and other intended users
believe and are confident that the system is very usable. It has functions and features
that are highly acceptable to the intended users. The system enables the librarians to
maintain and organize library processes and information for better library business
decision-making.
The proposed system is integrated into one centralized server hosted and
implemented at the main campus of the university. Likewise, the proposed library
system offers modules that are highly acceptable to the librarians, such as acquisition,
cataloging, circulation, OPAC, and administration. Moreover, graphical enterprise
reporting was implemented as a business intelligence technique for decision support,
which is also highly acceptable to the librarians.

