Although few articles have presented usability tests of customizable library portals, numerous usability studies measuring the effectiveness of library Web sites provide templates that researchers can use as models in measuring the ease of use and functionality of library Web portals. Jeffrey Rubin offered practical and comprehensive instructions for the usability testing process in the Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests.
Elaina Norlin and CM! Winters also offered a practical approach to usability testing in a library setting in Usability Testing for Library Web Sites: A Hands-on Guide.
Susan Augustine and Courtney Greene supported the usability testing method as an effective means of gathering both quantitative and qualitative feedback about the design of a library Web site. Augustine and Greene measured the amount of time and number of clicks required to perform a given task against a benchmark "expert" value. In addition, they stressed the importance of recording the verbal feedback of users as they perform each task.
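Augustine and Greene's approach lends itself to straightforward tabulation. The sketch below is not taken from their study; the task names, expert baselines, and participant figures are invented solely to illustrate how observed times and click counts might be expressed as ratios of an "expert" benchmark:

```python
# Hypothetical illustration of benchmark-based scoring in the style of
# Augustine and Greene: compare each participant's task time and click
# count against an "expert" baseline. All values below are invented.

expert_benchmark = {  # task -> (seconds, clicks) for an expert user
    "renew a book": (25, 3),
    "find a journal article": (60, 5),
}

participant_results = {
    "P1": {"renew a book": (70, 8), "find a journal article": (150, 11)},
    "P2": {"renew a book": (40, 4), "find a journal article": (95, 6)},
}

for pid, tasks in participant_results.items():
    for task, (secs, clicks) in tasks.items():
        bench_secs, bench_clicks = expert_benchmark[task]
        # Ratios above 1.0 mean the participant needed more time or
        # more clicks than the expert benchmark for the same task.
        print(f"{pid} | {task}: time x{secs / bench_secs:.1f}, "
              f"clicks x{clicks / bench_clicks:.1f}")
```

A ratio well above 1.0 on a given task flags that task for closer review alongside the verbal feedback recorded while the participant performed it.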
Louise McGillis and Elaine G. Toms; Ruth Dickstein and Victoria A. Mills; Barbara J. Cockrell and Elaine Anderson Jayne; Brenda Battleson, Austin Booth, and Jane Weintrop; and Janet K. Chisman, Karen R. Diller, and Sharon L. Walbridge all offered practical models for assessing the usability of library Web sites and search tools. These studies concur that usability testing of a group of no more than eight to ten subjects is an effective and cost-efficient means of gathering data pointing to problems in Web site functionality, design, and terminology.
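Such small-sample recommendations are commonly rationalized with a problem-discovery model along the lines of Nielsen and Landauer's, which none of the studies above derives but which the usability literature treats as standard: if each participant independently uncovers a proportion $\lambda$ of an interface's detectable problems, then $n$ participants are expected to reveal

\[ P(n) = 1 - (1 - \lambda)^n \]

of them. With Nielsen's often-quoted average of $\lambda \approx 0.31$, eight participants would be expected to surface $1 - 0.69^8 \approx 0.95$, or roughly 95 percent, of detectable problems, which is why samples beyond eight to ten subjects tend to yield sharply diminishing returns.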
In 2000, Todd Zazelenchuk and James Lane released the results of a usability study of the OneStart Portal, a prototype of a campuswide information portal for Indiana University. Although OneStart was not a library Web portal, the study measured user satisfaction with customization features unique to online portals, thus offering a practical model for the collection and compilation of usability study test data. In this study, usability test scores were divided into categories (e.g., overall flexibility, clarity of terms). As echoed in other Web site usability studies, "clarity of terms" proved to be the area in which users voiced the greatest dissatisfaction with the portal. Most usability studies present participants with a series of explicit tasks. In a departure from this method, Zazelenchuk and Lane presented nine users with a printout of an already customized portal page and asked them to manipulate an uncustomized portal page until it matched the printout. This method is limited to usability tests of customizable applications, but the authors claimed that it eliminates the level of coaching implicit in usability tests that outline specific tasks in detail.
In 2001, Justin Dopke and Gary Marchionini published the results of a usability test of the North Carolina State Library StartSquad Web Portal for Children. This test gathered and synthesized feedback from eight test subjects ranging from preschool to middle-school age. Usability study tasks were designed to measure the suitability and recognizability of interface graphics, top-level navigation functions, information retrieval functions, and overall satisfaction with the interface. Because of the intended age of the audience, the study has somewhat limited applicability to the testing of a college-level tool such as the academic version of My Chicago Library. However, the authors' classification of task types can be easily applied to evaluative tests of academic library Web portals.

Since the team at North Carolina State University conceived of and released the MyLibrary software, numerous case studies and anecdotal articles have been published that recount the experiences of libraries implementing the software. These articles offer insight into issues ranging from the initial workload required for implementing MyLibrary to feedback from patrons about the usefulness of having a customizable library Web portal at their disposal. These articles are