(IJCSIS) International Journal of Computer Science and Information Security,Vol. 9, No. 5, May 2011
Criteria Document as a high-level guide, the Detailed Evaluation Methods and Tools document outlines the specific methods proposed and references the criteria that these methods address. The latter document states:

Our approach is based not only on the quality standards to which JISC require the 22 projects to adhere and the individual standards which have been identified by the projects themselves, but also the maintainability, extendibility and the general robustness of the 22 projects' code and software. Code review will be a key focus of the evaluation of each project. Other evaluation and testing methods will vary according to the project and its outputs.
Summary of methods used
We used the following methods:
1. Brief online study.
2. Face-to-face interviews with project staff, with follow-up by email and phone (full accounts of interviews appear in the Appendix).
3. Selective checking of sections of code (per-project summary of code checking results appears below).
4. Lab testing of the software on various platforms and in various conditions (per-project summary of test results appears below; full results have been made available to projects and to programme staff).
5. Usability walkthrough and application of heuristics (per-project summary of usability results appears below; full results have been made available to projects and to programme staff).
6. Brief and selective desk audit of project documentation, version tracking, and testing procedures (the results of the audit were fed into the software evaluation and gave the testers insight into which areas of the tool might need particular attention, also alerting the testing team to any issues and bugs prior to testing and informing some project-specific questions asked during the interviews).
CRITERIA USED IN EVALUATION
The criteria used in the evaluation, together with a brief summary of the measures or "questions asked" for each, appear below.
Match of actual software output to planned output

Measures
Are the outputs materially different from those planned? Are they so delayed that the project will not, or is unlikely to, deliver those outputs?
Use of quality plan

Measures
Has the quality plan been completed and followed? Has it been used to aid in implementing the JISC Software Quality Assurance Policy? Has it been updated? Has it been used in creative and unforeseen ways?
Compliance with the JISC Open Source Policy

Measures
The (draft) JISC Open Source Policy (May 2004) itself details the necessary areas of compliance.
Compliance with Open Standards

Measures
Have the Open Standards outlined in the quality plan been used? If not, have other Open Standards been used?
Quality control procedures

Measures
Are there any documented formal or informal quality control procedures? Is there evidence of them being used? Can we reproduce testing procedures? Are issues and changes being documented?
Project-specific documentation

Measures
In addition to the project plan and quality plan, we asked each project to give us access to:
• the technical specification of their product
• a summary of their testing procedures
• a copy of their test plan
• reports from version tracking software
• code documentation.

It was anticipated that there would be little or no end-user documentation, and that few projects would have a complete technical specification document.

3. Results: