
Secure Digital Archiving

A SOFTWARE ENGINEERING PROJECT (2R690) at the EINDHOVEN UNIVERSITY OF TECHNOLOGY during 2001-2002 by group Argos

SOFTWARE VERIFICATION AND VALIDATION DOCUMENT


VERSION 1-0-0

Secure Digital Archiving Software Verification and Validation Plan Page 2

A. ABSTRACT
This document describes the Software Verification and Validation Plan (SVVP) for the SDA project and was made according to the ESA software engineering standards. In this document the project's verification and validation activities are described. The SDA project is part of the Software Engineering Project and is one of the seven assignments for the course 2R690 at Eindhoven University of Technology.


B. TABLE OF CONTENTS
A. Abstract
B. Table of Contents
C. Document Status Sheet
D. Document Change Record
1 Introduction
  1.1 Purpose
  1.2 Scope
  1.3 Definition, acronyms and abbreviations
  1.4 References
2 Verification Overview
  2.1 Organization
    2.1.1 Internal reviews
    2.1.2 External reviews
    2.1.3 Audits
    2.1.4 Tests
  2.2 Schedule
  2.3 Resources
  2.4 Responsibilities
  2.5 Tools, techniques and methods
3 Administrative Procedures
  3.1 Anomaly reporting and resolution
  3.2 Task iteration policy
  3.3 Deviation policy
  3.4 Control procedures
  3.5 Standards
4 Verification Activities
  4.1 Reviews
    4.1.1 Internal Review
    4.1.2 External Review
  4.2 Formal proofs
  4.3 Tracing
5 Verification Reporting
Appendices
A SR Phase
  A.1 Requirement identification
  A.2 Traceability tables
B AD Phase
  B.1 Traceability tables


C. DOCUMENT STATUS SHEET


Document name:      Software Verification and Validation Document
Document location:  SVVP/SVVP-1-0-0.*
Author(s):          Kees van der Sluijs, Onno Schutze
Version:            1-0-0
Status:             approved (draft / internally accepted / conditionally approved / approved)

Version  Date       Author(s)                          Summary
0-0-1    13-1-2002  Kees van der Sluijs, Onno Schutze  Document creation
0-0-2    3-2-2002   Kees van der Sluijs                Added SR appendix
0-1-0    4-2-2002   Kees van der Sluijs                Corrections applied after internal review
0-1-1    15-2-2002  Kees van der Sluijs                Added AD appendix
1-0-0    8-6-2002   Kees van der Sluijs                Upgrade to approved

Secure Digital Archiving Software Verification and Validation Plan Page 5

D. DOCUMENT CHANGE RECORD


Document Title:           Software Verification and Validation Plan (SVVP)
Document Identification:  SVVP/SVVP-1-0-0.*

Page  Paragraph  Reason for change


1 INTRODUCTION
1.1 Purpose
This document describes procedures concerning the testing of the delivered products (product documents and software) of the SDA-project for compliance with the requirements. Testing is the execution of a system or component under specified conditions, the observation or recording of the results and evaluation of these results. The requirements that the software is to be verified against can be found in the product documents [URD], [SRD], [ADD] and [DDD]. The modules to be verified and validated are defined in the AD phase. The goal of verifying and validating is to check whether the software product to be delivered conforms to the requirements of the client and to ensure a minimal number of errors in the software. This project document is written for managers and developers of the SDA project.


1.2 Scope
The software to be made is called the Secure Digital Archiving System (SDA-system). The main aim of the SDA-system is to manage the storage of digital documents in such a way that the documents can be stored and maintained in a safe and easy manner. The physical storage of all the documents might be outsourced to a third party. A third party should not be able to read or alter the content of the archive. However, within the organization colleagues and/or managers should be able to read or modify documents based solely on their role in the organization. It should also be possible for a user to grant rights over the documents to another user.


1.3 Definition, acronyms and abbreviations


2R690     The Software Engineering Course
AD        Architectural Design
ADD       Architectural Design Document
Argos     Group name
AT        Acceptance Test
ATP       Acceptance Test Plan
CI        Configuration Item
CM        Configuration Manager
CMP       Software Configuration Management Plan
DD        Detailed Design
DDD       Detailed Design Document
ESA       European Space Agency
IT        Integration Test
ITP       Integration Test Plan
PM        Project Manager
QM        Quality Manager
SDA       Secure Digital Archiving System
SM        Senior Management
SPMP      Software Project Management Plan
SQA       Software Quality Assurance
SQA team  Kees van der Sluijs and Richard Peters
SQAP      Software Quality Assurance Plan
SR        Software Requirements
SRD       Software Requirements Document
ST        System Test
STP       System Test Plan
SVVP      Software Verification and Validation Plan
TR        Transfer
UR        User Requirements
URD       User Requirements Document
UT        Unit Test
UTP       Unit Test Plan

1.4 References


[ADD]  Architectural Design Document of SDA Project, ADD\
[ATP]  Acceptance Test Plan of SDA Project, Bram Peeters, ATP\
[DDD]  Detailed Design Document of SDA Project, DDD\
[ESA]  ESA Software Engineering Standards, ESA PSS-05-02, Issue March 1995, ESA Board of Software Standardisation and Control (BSSC), ISBN 0-13-106568-8
[LECT] Lecture Notes Software Engineering (2R690), Dr. L.J.A.M. Somers, http://wwwis.win.tue.nl/2R690/se_cont.html
[SCMP] Software Configuration Management Plan of SDA Project, Jeroen van der Wulp, Guido Dorssers, SCMP\
[SPMP] Software Project Management Plan of SDA Project, Loek Simons, SPMP\
[SQAP] Software Quality Assurance Plan of SDA Project, Kees van der Sluijs, Richard Peters, SQAP\
[SRD]  Software Requirements Document of SDA Project, Jasper van Kempen, Richard Peters, Ling-Zhen Wu, SRD\
[URD]  User Requirements Document of SDA Project, Ronald Kruidhof, Jeroen van der Wulp, Ling-Zhen Wu, URD\


2 VERIFICATION OVERVIEW
2.1 Organization

This section describes the organization of reviews, audits and tests. The exact procedures and standards of reviews are described in chapter 4; here only the attendees are listed.

2.1.1 Internal reviews


Because the structure of a team reviewing a technical document differs from the structure of a team reviewing a management document, both cases are described.

The team carrying out the internal review of a technical document will at least consist of the following persons:
- A member of the SQA team
- The leader of the team responsible for the document
- At least one other member of Argos

If the member of the SQA team is also a member of the authoring team, an additional Argos member (not a member of the authoring team) has to attend the internal review.

The team carrying out the internal review of a management document will consist of the following persons:
- The Quality Manager
- The Project Manager
- The Configuration Manager


2.1.2 External reviews


The team carrying out the external review of a technical document will consist of the following persons:
- A member of the SQA team (preferably the QM)
- The leader of the team responsible for the document
- The Project Manager (optional)
- At least one member of the SM
- At least two experts, not being the advisor
- For the URD, SRD and ATP reviews: at least one representative of the client

The team carrying out the external review of a management document will consist of the following persons:
- At least one representative of the Senior Management (Peter Veltkamp)
- The Project Manager
- The Quality Manager


- The team leader of the authoring team

2.1.3 Audits
Audits are reviews that assess compliance with software requirements, specifications, baselines, standards, procedures, instructions, codes, and contractual and licensing requirements. A physical audit checks that all items identified as being part of the configuration are present in the product baseline. A functional audit checks that unit tests, integration tests and system tests have been carried out, and records their success or failure. Functional and physical audits can be performed before the release of the software (ESA Software Engineering Standards, 4.2.5 Auditing [ESA]).

The SM is allowed to audit the Argos project to check whether the procedures described in the management documents (SPMP, SQAP, SVVP and SCMP) are followed. Audits are not routine checks, but the SM can request them. The following rules apply to all audits:
- Only the SM can request audits.
- Audit requests must be directed to the PM.
- The audit request must include the following information:
  - Names of the auditors (at least two persons)
  - People required to be present
  - Possible dates of the audit
  - Purpose of the audit
  - Items that will be checked
- The audit is attended by a member of the SQA team (preferably the QM), the PM and possibly others as indicated by the SM.
- The results of the audit are reported in a written report to the PM and the QM within one week of the audit. This report must contain the following information:
  - Date of the audit
  - Participants in the audit
  - Checked items
  - Conclusion
  - Recommendations


2.1.4 Tests
The testing is done following the ESA life cycle verification approach [ESA] (figure 1). Verification and validation apply to every phase of the Argos project. The output of a phase is checked during a review as described in chapter 4. It is the responsibility of the QM to plan sufficient reviews for all documents. Test plans are made as soon as the corresponding specification is completed. Test plans outline the approach to testing and are essential for estimating the resources required to complete the project. Test designs, test cases and test procedures are defined and included in this document. Tests are then executed and results are recorded as described in the test plans. Testing consists of the following parts:
- Planning: creation of a test plan.


- Designing: detailed design of a test plan.
- Execution: execution of the tests described in the test plan.
- Reporting: writing the test reports.
- Problem solving: correcting the errors found.

Figure 1: ESA life cycle verification approach

The planning of the ATP will be done in the UR phase. A preliminary test design will be made in the SR phase. The final design will be made in the DD phase. Testing will be done in the TR phase. The [ATP] has to be approved by the client.

The planning of the STP will be done in the SR phase. It will be designed in the AD phase. Testing will be done in the DD phase.

The planning of the ITP will be done in the AD phase. The design and testing will be done in the DD phase.

The planning, designing and testing of the UTP will be done in the DD phase.

The results of all tests are presented to the PM and the QM.


2.2 Schedule
The schedules for all phases are given in the [SPMP, chapter 4.2].


2.3 Resources


Room 1.04 of the Matrix building at the TU/e is available for project meetings, internal and external reviews, and all other activities concerning the SDA project. Three computers with Windows 2000 pre-installed are available. Network connectivity, a phone and an A4/A3 printer are also available. The servers containing the Argos documents are also housed in room 1.04 and are permanently accessible from any computer with an internet connection.


2.4 Responsibilities
The PM is responsible for the progress of the entire project. He monitors the progress as described in the [SPMP, chapter 3.4]. The team leader is responsible for the validation of the CIs created by his team. The leader of the test team is responsible for the verification of the CIs. The SM may check the test procedures and results during an audit.


2.5 Tools, techniques and methods


The tools used in support of reviews are:
- Word processor (Microsoft Word 2000 or compatible, and a Word-to-PDF converter)
- Electronic mail system
- Review reports

The review reports are used to record the outcome of an external review.


3 ADMINISTRATIVE PROCEDURES
3.1 Anomaly reporting and resolution
Everything that is not up to standards or does not conform to requirements is an anomaly. Procedures for anomaly resolution can be found in the [SQAP, chapter 7]. Furthermore, it is the task of the SQA team to monitor whether the procedures as defined in the management plans ([SPMP], [SCMP], [SQAP] and [SVVP]) are followed. This is done during team leader meetings, reviews and by randomly checking CIs. Findings are reported to the PM. It is then the responsibility of the PM to enforce compliance with defined procedures. If the results of the PM's actions are not satisfactory to the QM, he can request the senior management to take further action.


3.2 Task iteration policy


Every task performed is to be reviewed as described in chapter 4. If during this review problems are discovered concerning the correct conclusion of the task, a decision is made concerning the iteration of the task. Guidelines for the following cases are:
- The team responsible was unable to complete its task, for example because of a lack of knowledge or manpower. In this case, it is the responsibility of the team leader to solve the problem and make sure the task is completed as described in the [SPMP]. If the team leader is unable to do so, he must report this to the PM and QM. If problems arise concerning the dependencies between tasks, these are to be reported to the PM and QM.
- A structural error was found in the execution of the task, for example the output of a piece of code does not comply with the requirements. In this case, the team responsible performs the task again. If necessary, the PM schedules extra man-hours.
- An item was forgotten during the execution of a task. Depending on the severity of this item, the PM decides whether the entire task or only a part of it needs to be redone, or that no action is taken.

3.3 Deviation policy


During the SDA project, the procedures described in the management documents are followed. However, if, in the QM's opinion, following them endangers the completion of the project, the QM can decide to deviate from these procedures. If the decision is made to deviate from the procedures described in the management documents, the PM must be informed of the deviation.


3.4 Control procedures


Procedures assuring that CIs are not accidentally or deliberately changed are described in the [SCMP, chapter 4.3].

3.5 Standards
Procedures concerning coding conventions are defined in the [SQAP, chapter 4].

The [SCMP, chapter 3.1] defines the identification conventions for documents and software components.



4 VERIFICATION ACTIVITIES
4.1 Reviews
Review procedures are held during all phases of the SDA project. CIs are reviewed in the phase in which they are delivered; an overview of which CI is delivered in which phase can be found in the [SPMP, chapter 4]. All project and product documents have one of the following statuses:
- Draft (initial status)
- Internally accepted
- Conditionally approved
- Approved

These statuses can also apply to a part of a document. For example, a document can be internally approved for the first 5 chapters, while the following chapters (appendices) are reviewed separately. When they are internally approved, they can be added to the document, which then becomes internally approved up to and including the added chapters (appendices). For a (part of a) document to become approved it has to be reviewed. The procedure for reviewing documents is as follows. (Note that the same procedure applies to parts of a document when reviewed.)
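The status lifecycle above can be read as a small state machine. The sketch below is illustrative only: the status names come from this section, but the allowed transitions are an assumption inferred from the internal and external review procedures in sections 4.1.1 and 4.1.2.

```python
# Illustrative sketch of the document status lifecycle.
# Status names are from the SVVP; the transition rules are an assumption
# based on the internal/external review procedures (a rejected or
# conditionally approved document may fall back to draft).
ALLOWED_TRANSITIONS = {
    "draft": {"internally accepted"},
    "internally accepted": {"conditionally approved", "approved", "draft"},
    "conditionally approved": {"approved", "draft"},
    "approved": set(),  # terminal status
}

def advance(status, new_status):
    """Return new_status if the transition is allowed, else raise ValueError."""
    if new_status not in ALLOWED_TRANSITIONS[status]:
        raise ValueError(f"illegal transition: {status!r} -> {new_status!r}")
    return new_status

# A document moving through the full lifecycle:
status = "draft"
status = advance(status, "internally accepted")
status = advance(status, "conditionally approved")
status = advance(status, "approved")
```

Such a table makes it easy to check mechanically that no document skips a review step, e.g. going from draft straight to approved.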


4.1.1 Internal Review


Internal reviews of technical documents are attended by:
- The Quality Manager (reviewer)
- The leader of the team responsible for the document (leader)
- At least one other member of Argos (reviewer)

The team carrying out the internal review of a management document will consist of the following persons:
- The Quality Manager
- The Project Manager
- The Configuration Manager

Table 4-1 shows the action list for the preparation and execution of the internal reviews of (all) documents. T is the time of the review meeting. The terms leader, reviewer and QM in the who-column correspond with the attendees mentioned in the previous paragraph.

1.  QM: Set a date for the internal review of the document.
2.  Leader (T - 2 days): Deliver the review version of the document to the reviewers.
3.  Reviewer (before T): Inspect the document (language errors are underlined).
4.  Reviewer (at T): Discuss all errors other than language errors.
5.  Leader (at T): Write down all necessary changes.
6.  Reviewer (at T): Decide if the document can be approved, provided the stated changes are made.
7.  QM (at T): If the document cannot be approved, an appointment for a new review meeting is made.
8.  Leader (at T): Collect the annotated documents.
9.  QM (after T): See to it that the stated remarks are handled properly by the team delivering the document.
10. QM (after T): Grant the document the status "internally accepted" if all requested changes are made.

Table 4.1: Action list for internal reviews of documents

4.1.2 External Review


External reviews of technical documents are attended by:
- A member of the SQA team (preferably the QM), who acts as chairman (chairman)
- The leader of the team responsible for the document, who acts as minutes secretary (leader)
- The Project Manager (optional)
- At least one member of the SM (reviewer)
- At least two experts, not being the advisor (reviewer)
- For the URD, SRD and ATP reviews: at least one representative of the client (reviewer)


Table 4-2 shows the action list for the preparation and execution of the external reviews of documents. T is the time of the review meeting. The terms leader, reviewer and chairman in the who-column correspond with the attendees mentioned in the previous paragraph. The duration of the review is one hour (plus an additional quarter of an hour if necessary). The review meeting is held in HG 6.05.

1.  PM (after the internal acceptance): Arrange the external review.
2.  Leader (T - 3 workdays): Deliver the paper version of the document, along with a return address, to the TU/e mailbox of all reviewers (except Lou Somers, who wants the PDF file mailed to lsom@oce.nl).
3.  Reviewer (between T - 3 workdays and T - 2 workdays): Inspect the document (language errors are indicated on the copy of the document).
5.  Reviewer (T - 2 workdays): May deliver a summary of their comments on the document.
6.  Leader (T - 1 workday): Deliver the comments to the chairman.
8.  Chairman (at T): Lead the meeting by a walkthrough of the document and make sure all the comments on the list are discussed.
9.  Chairman (at T): Assure that there is time for the final decision.
10. Chairman (at T): Keep discussions to the point.
11. Leader (at T): Document everything that is discussed during the review.
12. Reviewer (at T): Discuss all comments that need explanation or discussion (no time will be spent on language errors and self-explanatory comments).
13. Leader (at T): Collect the annotated copies of the document for the handling of language errors.
14. Reviewer (at T): Decide the status of the document at the end of the meeting. There are three possible outcomes:
    - the document is rejected and a new appointment is made;
    - the document is accepted and the SM grants the status "approved";
    - the document is accepted conditionally (it will be accepted when all the changes stated during the review meeting are handled properly; no new review needs to be held; the corrected version of the document is sent to the reviewers as described in action 2, and the reviewers confirm their final acceptance via e-mail).

If the document is not unconditionally accepted (that is, it is rejected or accepted conditionally), actions 15 and 16 apply. If the document is accepted, actions 15 and 16 don't apply.

15. QM (after T): See to it that the remarks are handled properly by the team responsible for the document.
16. QM (after T): Grant the document the status "approved" if all reviewers confirm that their remarks have been handled properly.

Table 4.2: Action list for the external review of documents


The discussion list used during the review meeting has the following columns:

Location in document — The location is a (one-point) interval of line numbers, of section/subsection numbers, of page numbers, or general.

Severity — The severity is indicated by an integer in the range [1..5], 5 being the most severe.

Comments — Comments summarise the discrepancy; further explanation is given during the meeting. This is done as concretely and precisely as possible.

Table 4.3: The columns in the review meeting discussion list

The team carrying out the external review of a management document will consist of the following persons:
- At least one representative of the Senior Management (Peter Veltkamp)
- The Project Manager
- The Quality Manager
- The team leader of the authoring team


The external review of a management document resembles the external review of a technical document, with the difference that there is no actual meeting. Everything is handled through correspondence.

4.2 Formal proofs


Formal proofs will be given where the SVVP team considers them necessary.

4.3 Tracing
During the SDA project the relation between the input and the output of a phase must be checked several times. A traceability table is included in the validation report, which is added to each product CI (except the URD). In this table the CI is traced to the input of the phase. During the software life cycle it is necessary to trace:
- User requirements to software requirements and vice versa; this is checked during reviews in the SR phase.
- Software requirements to component requirements and vice versa; this is checked during reviews in the AD phase.
- Integration tests to architectural units and vice versa; this is described in the integration test plans. These tests are performed during the DD phase.
- Unit tests to the modules of the detailed design; this is described in the unit test plans. These tests are performed during the DD phase.
- System tests to software requirements and vice versa; this is described in the system test plans. These plans are executed during the DD phase.
- Acceptance tests to user requirements and vice versa; this is described in the acceptance test plans. These tests are executed during the TR phase.

To support traceability, all requirements are identified. Requirements are identified as follows:

DCatRCatNo

where DCat defines the category of the requirement. Allowed values are:
- UR (User Requirement [URD])
- SR (Software Requirement [SRD])

RCat defines the subcategory of the requirement. Allowed values if the DCat value is UR are:
- CAR (Capability Requirement)
- COR (Constraint Requirement)

'No' is a three-digit number for each requirement in ascending order, starting at 001.

Example: URCOR007 identifies constraint requirement seven of the user requirements.
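The identifier scheme lends itself to a mechanical check. The sketch below is illustrative only; the UR subcategories (CAR, COR) come from this section and the SR subcategories (FUR, NFR) from appendix A.

```python
import re

# Requirement identifier: DCat + RCat + No, as described above.
# UR subcategories CAR/COR are defined in section 4.3; SR subcategories
# FUR/NFR are defined in appendix A.
_REQ_ID = re.compile(r"^(?P<dcat>UR|SR)(?P<rcat>CAR|COR|FUR|NFR)(?P<no>\d{3})$")

# Each document category only permits its own subcategories.
_VALID_RCAT = {"UR": {"CAR", "COR"}, "SR": {"FUR", "NFR"}}

def parse_requirement_id(req_id):
    """Split a requirement identifier into (DCat, RCat, No) or raise ValueError."""
    m = _REQ_ID.match(req_id)
    if m is None or m.group("rcat") not in _VALID_RCAT[m.group("dcat")]:
        raise ValueError(f"malformed requirement identifier: {req_id!r}")
    return m.group("dcat"), m.group("rcat"), int(m.group("no"))

print(parse_requirement_id("URCOR007"))  # ('UR', 'COR', 7)
```

A checker like this can reject mixed identifiers such as URFUR001 (a UR with an SR subcategory) before they reach a traceability table.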


5 VERIFICATION REPORTING
For the verification and validation of technical CIs (apart from the [URD]) two parts are added to these CIs:
- A verification report
- A validation report


These reports are presented to and checked by a member of the SQA team. The people performing the test of the CI write the verification report. The people who delivered the CI write the validation report. Both are checked when the CI is reviewed. A validation report is written as a result of the tracing; it contains a traceability table. A verification report is written as a result of a test; it contains the following items:
- Unique reference number of the test plan
- Problems discovered and, if available, solutions to these
- Acceptance or disapproval of the CI; in case of disapproval, accompanied by a short explanation of the reasons


For the verification and validation of documents, internal and external reviews are held. For the verification and validation of the entire SDA project, progress meetings are held with the SM every two weeks on Monday [SPMP, section 3.4].


APPENDICES

A SR PHASE
For the second phase of the project (SR), the SQA team must verify the output of the SRD against the requirements of the URD.

A.1 Requirement identification

To support traceability, all requirements are identified. Requirements are identified as follows:

DCatRCatNo

where DCat defines the category of the requirement. Allowed values are:
- UR (User Requirement [URD])
- SR (Software Requirement [SRD])

RCat defines the subcategory of the requirement. Allowed values if the DCat value is SR are:
- FUR (Functional Requirement)
- NFR (Non-functional Requirement)

'No' is a three-digit number for each requirement in ascending order, starting at 001.

Example: SRFUR008 identifies functional requirement eight of the software requirements.

A.2 Traceability tables

As part of the tracing of the [SRD] to the [URD] (and back), two traceability tables are constructed:
- One table tracing each of the user requirements to a software requirement (example in table A-1). In the first column the user requirements are listed in alphabetical order; in the second column the corresponding software requirements are listed.
- One table tracing each of the software requirements to a user requirement (example in table A-2). In the first column the software requirements are listed in alphabetical order; in the second column the corresponding user requirements are listed. This table is in fact the first table with the columns reversed, sorted by the software requirements.

User requirement  Software requirement
URCAR001          SRFUR003
URCAR002          SRFUR004
URCAR003          SRFUR001


URCAR004          SRNFR001
URCOR001          SRFUR002
URCOR002          SRNFR002

Table A.1: Example of a traceability table tracing user to software requirements

Software requirement  User requirement
SRFUR001              URCAR003
SRFUR002              URCOR001
SRFUR003              URCAR001
SRFUR004              URCAR002
SRNFR001              URCAR004
SRNFR002              URCOR002

Table A.2: Example of a traceability table tracing software to user requirements
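Since table A-2 is just table A-1 with the columns reversed and re-sorted, the reverse table can be derived mechanically rather than maintained by hand. A sketch (illustrative only; the identifiers are the example values from tables A-1 and A-2):

```python
from collections import defaultdict

# Forward traceability: user requirement -> software requirement(s),
# using the example values from table A-1.
ur_to_sr = {
    "URCAR001": ["SRFUR003"],
    "URCAR002": ["SRFUR004"],
    "URCAR003": ["SRFUR001"],
    "URCAR004": ["SRNFR001"],
    "URCOR001": ["SRFUR002"],
    "URCOR002": ["SRNFR002"],
}

def invert(table):
    """Build the reverse traceability table (table A-2) from the forward one."""
    reverse = defaultdict(list)
    for src, targets in table.items():
        for tgt in targets:
            reverse[tgt].append(src)
    # Sort rows by the (software) requirement identifier, as in table A-2.
    return dict(sorted((key, sorted(vals)) for key, vals in reverse.items()))

sr_to_ur = invert(ur_to_sr)
print(sr_to_ur["SRFUR001"])  # ['URCAR003']
```

Deriving the second table this way also catches untraced requirements: any software requirement missing from the inverted table has no originating user requirement.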


B AD PHASE
For the third phase of the project (AD), the SQA team must verify the output of the ADD against the requirements of the SRD.

B.1 Traceability tables

As part of the tracing of the [ADD] to the [SRD] (and back), two traceability tables are constructed:
- One table tracing each of the software requirements to a component (example in table B-1). In the first column the software requirements are listed in alphabetical order; in the second column the components that implement them are listed.
- One table tracing each of the components to the software requirements it implements (example in table B-2). In the first column the components are listed in alphabetical order; in the second column the software requirements implemented by them are listed. This table is in fact the first table with the columns reversed, sorted by the components.

Software requirement  Component
SRFUR001              ComponentA
SRFUR002              ComponentB
SRFUR003              ComponentB
SRFUR004              SDACore
SRNFR001              SDACore
SRNFR002              ComponentB

Table B.1: Example of a traceability table tracing software requirements to components

Component   Software requirements
ComponentA  SRFUR001
ComponentB  SRFUR002, SRFUR003, SRNFR002
SDACore     SRFUR004, SRNFR001

Table B.2: Example of a traceability table tracing components to software requirements
