SOFTWARE QUALITY ASSURANCE PLAN TEMPLATE

(BASED ON ANSI/IEEE STD 730.1-1989)

1.0 INTRODUCTION
  1.1 Purpose
  1.2 Scope
  1.3 Software Items
  1.4 Software Life Cycle
2.0 REFERENCE DOCUMENTS
3.0 MANAGEMENT
  3.1 Organization
    3.1.1 Organizational Structure
    3.1.2 Organizational Description
    3.1.3 Organizational Independence
  3.2 Tasks
    3.2.1 Software Life Cycle
    3.2.2 SQA Activities
    3.2.3 Milestones
  3.3 Responsibilities
    3.3.1 Software Activities
    3.3.2 Software Work Products
    3.3.3 Walkthroughs of Software Work Products
    3.3.4 Inspections of Software Work Products
4.0 DOCUMENTATION
  4.1 Purpose
  4.2 Minimum Documentation Requirements
    4.2.1 Software Requirements Document (SRD)
    4.2.2 Software Architecture Description (SAD)
    4.2.3 Software Verification and Validation Plan (SVVP)
    4.2.4 Software Verification and Validation Report (SVVR)
    4.2.5 User Documentation Description (UDD)
    4.2.6 Software Configuration Management Plan (SCMP)
  4.3 Other
    4.3.1 Software Project Plan (SPP)
    4.3.2 System Requirements Specification (SRS)
    4.3.3 System Architecture and Requirements Allocation Description (SARAD)
    4.3.4 Database Design Description (DDD)
    4.3.5 Software Interface Design Description (SIDD)
    4.3.6 Test or Validation Plan (TVPL)
    4.3.7 Software Design Description (SDD)
    4.3.8 Test or Validation Procedures (TVPR)
    4.3.9 Test or Validation Results Report (TVRR)
    4.3.10 Software Integration Plan (SOIP)
    4.3.11 Software Integration Audit Report (SIAR)
    4.3.12 Software Installation Plan (SIP)
5.0 STANDARDS, PRACTICES, CONVENTIONS, AND METRICS
  5.1 Purpose
  5.2 Content
    5.2.1 Documentation Standards
    5.2.2 Logic Structure Standards
    5.2.3 Coding and Commentary Standards
    5.2.4 Testing Standards and Practices
    5.2.5 Software Process and Product Metrics
6.0 REVIEWS AND AUDITS
  6.1 Purpose
    6.1.1 Technical and Managerial Reviews and Audits
    6.1.2 Accomplishing Reviews and Audits
    6.1.3 Implementing and Verifying Reviews and Audits
  6.2 Minimum Requirements
    6.2.1 Software Requirements Review (SRR)
    6.2.2 Software Preliminary Design Review (SPDR)
    6.2.3 Software Critical Design Review (SCDR)
    6.2.4 Software Verification and Validation Plan Review (SVVPR)
    6.2.5 Functional Configuration Audit (FCA)
    6.2.6 Physical Configuration Audit (PCA)
    6.2.7 In-Process Audit
    6.2.8 Managerial Review
    6.2.9 Software Configuration Management Plan Review (SCMPR)
    6.2.10 Post Mortem Review
  6.3 Other
    6.3.1 System/Subsystem Requirements Review (SSRR)
    6.3.2 System/Subsystem Design Review (SSDR)
    6.3.3 Software Test Readiness Review (SOTRR)
    6.3.4 Software Test Results Review (SOTRER)
    6.3.5 System Test Readiness Review (SYTRR)
    6.3.6 System Test Results Review (SYTRER)
    6.3.7 Software Usability Review (SUR)
    6.3.8 Software Maintenance Review (SMR)
7.0 TEST
8.0 PROBLEM REPORTING AND CORRECTIVE ACTION
9.0 TOOLS, TECHNIQUES, AND METHODOLOGIES
10.0 CODE CONTROL
11.0 MEDIA CONTROL
12.0 SUPPLIER CONTROL
13.0 RECORDS COLLECTION, MAINTENANCE, AND RETENTION
14.0 TRAINING
15.0 RISK MANAGEMENT

1.0 INTRODUCTION

This section shall delineate the specific purpose and scope of the particular SQAP. It shall list the name(s) of the software items covered by the SQAP and the intended use of the software. It shall state the portion of the software life cycle covered by the SQAP for each software item specified.

1.1 Purpose

The purpose of the SQAP is to define a planned and systematic pattern of all actions necessary to provide adequate confidence that a software work product conforms to established technical requirements. Specifically, the SQAP defines a set of activities designed to evaluate the software processes by which software work products are developed and/or maintained.

1.2 Scope

The scope of the SQAP includes:

• definition of the SQA organization, tasks, and responsibilities;
• identification of the minimum documentation requirements for software developers and how SQA verifies them;
• identification of the standards, practices, conventions, and metrics for software developers and how SQA verifies them;
• identification of reviews and audits;
• identification of software tests not included in the SVVP;
• identification of the practices and procedures for problem reporting and corrective action;
• identification of the tools, techniques, and methodologies for SQA;
• identification of the code control, media control, supplier control, and records collection, maintenance, and retention policies and procedures from software configuration management;
• identification of SQA training requirements; and
• identification of the risk management methods and procedures to be used by the software project manager.

1.3 Software Items

The software items covered by the SQAP include the operating system CSCI, data acquisition CSCI, data management CSCI, and data processing CSCI of the command and control system. The command and control system enables the high-speed collection, storage, and postprocessing of real-time telemetry data from specialized data measurement equipment.

• Operating System CSCI: The operating system CSCI provides the integrating framework for the other three CSCIs: the data acquisition CSCI, data management CSCI, and data processing CSCI. The operating system CSCI provides key integrating functions such as the human-computer user interface, caution and warning, status messaging and logging, automatic command and control system execution, special test scenario execution, manual control, system startup and shutdown, system initialization, and system debugging.

• Data Acquisition CSCI: The data acquisition CSCI provides key functions such as a real-time interface to the specialized data measurement equipment, high-speed data collection, data-rate configuration, data-size configuration, built-in-test, initialization, shutdown, and an automated command interface to the data management CSCI and operating system CSCI.

• Data Management CSCI: The data management CSCI provides key functions such as a real-time interface to the data acquisition CSCI, high-speed data storage, archiving, and retrieval, automatic data-rate detection, automatic data-size detection, built-in-test, initialization, shutdown, and automated interfaces to the data processing CSCI and operating system CSCI.

• Data Processing CSCI: The data processing CSCI provides key functions such as real-time and non-real-time data processing, high-speed data reduction and analysis, data-rate detection, automatic data-size detection, built-in-test, initialization, shutdown, and automated interfaces to the data management CSCI and operating system CSCI.

The SQAP in its entirety applies to the command and control system and its four CSCIs. The management, documentation, standards, practices, conventions, and metrics, reviews and audits, test, problem reporting and corrective action, tools, techniques, and methodologies, code control, media control, supplier control, records collection, maintenance, and retention, training, and risk management requirements of the SQAP apply to the command and control system software.

1.4 Software Life Cycle

The software life cycle to which the SQAP applies for all CSCIs is defined by IEEE 12207. The software life cycle is the period of time that begins when a software product is conceived and ends when the software is no longer available for use. More specifically, the software life cycle is a collection of interrelated activities, or software processes, for managing and developing software-based products and services. The software life cycle phases to which the SQAP applies include system requirements analysis, system architectural design, software requirements analysis, software architectural design, software detailed design, software coding and testing, software integration, software qualification testing, system integration, system qualification testing, software installation, and software acceptance support.

2.0 REFERENCE DOCUMENTS

This section shall provide a complete list of documents referenced elsewhere in the text of the SQAP.


The reference documents upon which the SQAP is principally based are three: the IEEE Standard for Software Quality Assurance Plans, the IEEE Standard for Software Reviews and Audits, and the IEEE Standard for Software Life Cycle Processes.

• ANSI/IEEE STD 730.1-1989 (IEEE Standard for Software Quality Assurance Plans): The purpose of this standard is to provide uniform, minimum acceptable requirements for preparation and content of Software Quality Assurance Plans (SQAPs).

• ANSI/IEEE STD 1028-1988 (IEEE Standard for Software Reviews and Audits): The purpose of this standard is to provide definitions and uniform requirements for review and audit processes.

• IEEE/EIA 12207.0-1996 (IEEE Standard for Software Life Cycle Processes): The purpose of this standard is to provide uniform, minimum acceptable requirements for software activities, software products, software technical reviews, software records, and software joint reviews.

• ANSI/IEEE STD 1012-1986 (IEEE Standard for Software Verification and Validation Plans): The purpose of this standard is to provide uniform and minimum requirements for the format and content of SVVPs, define minimum V&V tasks, and suggest optional V&V tasks.

• DI-IPSC-81433-941205 (MIL-STD-498 Software User Manual Data Item Description): The purpose of this DID is to tell a hands-on software user how to install and use a CSCI, a group of related CSCIs, or a software system or subsystem. It may also cover a particular aspect of software operation, such as instructions for a particular position or task.
• ANSI/IEEE STD 828-1990 (IEEE Standard for Software Configuration Management Plans): The purpose of this standard is to establish the minimum required contents of SCM plans and activities, which include the identification and establishment of baselines; the review, approval, and control of changes; the tracking and reporting of such changes; the audits and reviews of the evolving software product; and the control of interface documentation and project supplier SCM.

• ANSI/IEEE STD 1058.1-1987 (IEEE Standard for Software Project Management Plans): The purpose of this standard is to prescribe the format and content of software project management plans, which serve as controlling documents for managing software projects.

• OMG Version 1.3-June 1999 (OMG Unified Modeling Language Specification): The purpose of this standard is to serve as a precise and self-consistent definition of UML semantics and notation. UML is a graphically and visually oriented diagramming standard for representing analytical models of software requirements and software designs.

• SPC-94093-CMC Version 01.00.10-October 1995 (Ada 95 Quality and Style: Guidelines for Professional Programmers): The purpose of this document is to provide software source code style and coding guidelines for the Ada 95 computer programming language, including requirements for source code presentation, readability, program structure, programming practices, concurrency, portability, reusability, object-oriented features, and improving performance.

• Cannon, L.W., Elliot, R.A., Kirchhoff, L.W., Miller, J.H., Milner, J.M., Mitze, R.W., Schan, E.P., Wittington, N.O., Spencer, H., Keppel, D., and Brader, M., Revision 6.0, 25-June-1990 (Indian Hill C Style and Coding Standards): The purpose of this document is to provide software source code style and coding guidelines for the C computer programming language, including requirements for file organization, comments, declarations, function declarations, whitespace, simple statements, compound statements, operators, naming conventions, constants, macros, conditional compilation, debugging, portability, ANSI C, special considerations, lint, make, and project-dependent standards.

• Gabryelski, K., Wildfire Communications, Inc., 1997 (Wildfire C++ Programming Style: With Rationale): The purpose of this document is to provide software source code style and coding guidelines for the C++ computer programming language, including requirements for files, the preprocessor, identifier naming conventions, use of white space, types, variables, functions, statements, miscellaneous topics, and interaction with C. The file requirements include file naming conventions, file organization, header file content, and source file content.

• Patrick, T., Prentice Hall, 2000 (Visual Basic Style Guide): The purpose of this document is to provide software source code style and coding guidelines for the Visual Basic computer programming language, including requirements for declaration standards, keyword reference, control and user interface standards, and database standards.

• Berners-Lee, T., W3C, 1998 (W3C Style Guide for Online Hypertext): The purpose of this document is to provide style and coding guidelines for the HTML markup language, including requirements for markup tags, character formatting, linking, inline images, tables, and fill-out forms.
• Sun Microsystems, Inc., 20-APR-99 (Sun Code Conventions for the Java Programming Language): The purpose of this document is to provide software source code style and coding guidelines for the Java computer programming language, including requirements for file names, file organization, indentation, comments, declarations, statements, white space, naming conventions, and programming practices.

• DoD and US Army Version 4.0b-October 2000 (PSM Practical Software and Systems Measurement: A Foundation for Objective Project Management): The purpose of this document is to introduce software process and product measurement guidelines for managing system and software projects, including broad classes of software measures, guidelines for application, and practical examples.

• MIL-STD-1521B-4 June 1985 (Military Standard for Technical Reviews and Audits for Systems, Equipments, and Computer Software): The purpose of this standard is to prescribe the requirements for the conduct of technical reviews and audits on systems, equipments, and computer software. It has been designed to take advantage of current technological advancements and management procedures in conducting reviews and audits.

3.0 MANAGEMENT

This section shall describe organization, tasks, and responsibilities.

3.1 Organization

This paragraph shall depict the organizational structure that influences and controls the quality of the software. This shall include a description of each major element of the organization together with the delegated responsibilities. Organizational dependence or independence of the elements responsible for SQA from those responsible for software development and use shall be clearly described or depicted.

3.1.1 Organizational Structure

The organizational structure to which the SQAP applies consists of software engineering, software testing, SCM, and SQA itself.

3.1.2 Organizational Description

The organizational description to which the SQAP applies consists of software engineering, which is responsible for software development; software testing, which is responsible for evaluating the software; SCM, which is responsible for controlling software baselines; and SQA, which is responsible for evaluating the software engineering, software testing, and SCM processes.

• Software Engineering: Software engineering is the collection of individuals (both managers and technical staff) who have responsibility for software development and maintenance activities (i.e., requirements analysis, design, code, and test) for a project. Groups performing software-related work, such as the software quality assurance group, the software configuration management group, and the software engineering process group, are not included in the software engineering group.

• Software Testing: Software testing is a process of dynamically operating, exercising, executing, and evaluating CSCIs to ensure that they meet their software requirements, through the application of software test plans, software test designs, software test cases, software test procedures, and test reports.


• SCM: SCM is a discipline applying technical and administrative direction and surveillance to identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements.

• SQA: SQA is defined as (1) a planned and systematic pattern of all actions necessary to provide adequate confidence that a software work product conforms to established technical requirements, and (2) a set of activities designed to evaluate the process by which software work products are developed and/or maintained.

3.1.3 Organizational Independence

The organizational independence of SQA consists of mutually exclusive chains of authority, responsibility, functional organization, and reporting channels between software engineering, software testing, SCM, and especially SQA. Software engineering reports to a software project lead and the software engineering functional manager. SQA is not functionally subordinate to software engineering, the software project lead, or the software engineering functional manager, and thus maintains power, status, and authority independent of software engineering in order to preserve the independence, objectivity, and integrity of SQA activities. Furthermore, SQA does not report to the system project or program manager; this further preserves SQA independence and protects SQA software process evaluation activities and results from the cost, quality, schedule, and delivery pressures of software projects.

3.2 Tasks

This paragraph shall describe (a) that portion of the software life cycle covered by the SQAP, (b) the tasks to be performed with special emphasis on software quality assurance activities, and (c) the relationships between these tasks and the planned major check-points. The sequence of the tasks shall be indicated.

3.2.1 Software Life Cycle

The software life cycle phases to which the SQAP applies include system requirements analysis, system architectural design, software requirements analysis, software architectural design, software detailed design, software coding and testing, software integration, software qualification testing, system integration, system qualification testing, software installation, and software acceptance support.

• System Requirements Analysis Phase: System requirements analysis is the process of developing system-level requirements, for computer software configuration items (CSCI) of a system or segment of a system, for later use by system architectural design.


The following matrix summarizes, for each software life cycle phase (software activity), the software products produced, the technical reviews performed, the software records generated, and the joint review conducted.

System Requirements Analysis
  Software Product: SRS
  Technical Review: Walkthrough; Inspection
  Software Record: SYRER
  Joint Review: System/Subsystem Requirements Review

System Architectural Design
  Software Product: SARAD
  Technical Review: Walkthrough; Inspection
  Software Record: SYAER
  Joint Review: System/Subsystem Design Review

Software Requirements Analysis
  Software Product: SRD; UDD
  Technical Review: Walkthrough; Inspection
  Software Record: SORER
  Joint Review: Software Requirements Review

Software Architectural Design
  Software Product: DDD (p); SAD; SIDD (p); TVPL; UDD (u)
  Technical Review: Walkthrough; Inspection
  Software Record: SOAER
  Joint Review: Software Preliminary Design Review

Software Detailed Design
  Software Product: DDD (d); SDD; SIDD (d); TVPL (u); UDD (u)
  Technical Review: Walkthrough; Inspection
  Software Record: DDER
  Joint Review: Software Critical Design Review

Software Coding and Testing
  Software Product: DDD (u); TVPL (u); TVPR; UDD (u); TVRR
  Technical Review: Walkthrough; Inspection
  Software Record: EOCR; SCTRER; SCR
  Joint Review: (none)

Software Integration
  Software Product: SOIP; TVPR (u); UDD (u); TVRR
  Technical Review: Walkthrough; Inspection
  Software Record: SIER
  Joint Review: Software Test Readiness Review

Software Qualification Testing
  Software Product: UDD (u); SIAR; TVRR
  Technical Review: Walkthrough; Inspection
  Software Record: DER; SCR
  Joint Review: Software Test Results Review

System Integration
  Software Product: TVPR (u); TVRR
  Technical Review: Walkthrough; Inspection
  Software Record: SQTER
  Joint Review: System Test Readiness Review

System Qualification Testing
  Software Product: TVRR
  Technical Review: Walkthrough; Inspection
  Software Record: SCR; SER; SQTARR
  Joint Review: System Test Results Review

Software Installation
  Software Product: SIP
  Technical Review: (none)
  Software Record: SIRR
  Joint Review: Software Usability Review

Software Acceptance Support
  Software Product: TVRR
  Technical Review: (none)
  Software Record: SCR
  Joint Review: Software Maintenance Review

PLAN (3)
  SIP     Software Installation Plan
  SOIP    Software Integration Plan
  TVPL    Test or Validation Plan

SPECIFICATION (1)
  SRS     System Requirements Specification

DESCRIPTION (7)
  DDD     Database Design Description
  SAD     Software Architecture Description
  SARAD   System Architecture and Requirements Allocation Description
  SDD     Software Design Description
  SIDD    Software Interface Design Description
  SRD     Software Requirements Description
  UDD     User Documentation Description

PROCEDURE (1)
  TVPR    Test or Validation Procedures

REPORT (2)
  SIAR    Software Integration Audit Report
  TVRR    Test or Validation Results Report

RECORD (14)
  DDER    Detailed Design Evaluation Record
  DER     Documentation Evaluation Record
  EOCR    Executable Object Code Record
  SCR     Source Code Record
  SCTRER  Software Code and Test Results Evaluation Record
  SER     System Evaluation Record
  SIER    Software Integration Evaluation Record
  SIRR    Software Installation Results Record
  SOAER   Software Architecture Evaluation Record
  SORER   Software Requirements Evaluation Record
  SQTARR  System Qualification Test Audit Results Record
  SQTER   System Qualification Test Evaluation Record
  SYAER   System Architecture Evaluation Record
  SYRER   System Requirements Evaluation Record

(p) preliminary, (d) detailed, (u) updated
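A matrix like the one above also lends itself to machine checking. The sketch below is illustrative only and is not part of the SQAP; the mapping is abridged to four phases, and the checking function is hypothetical. It represents the phase-to-product mapping as a dictionary and verifies that every product acronym a phase cites is defined in the acronym legend.

```python
# Illustrative sketch (not part of the SQAP): the phase-to-product matrix as a
# data structure, with a check that every product acronym used by a phase is
# defined in the acronym legend. Abridged to four phases and eight acronyms.

LEGEND = {
    "SRS": "System Requirements Specification",
    "SARAD": "System Architecture and Requirements Allocation Description",
    "SRD": "Software Requirements Description",
    "UDD": "User Documentation Description",
    "DDD": "Database Design Description",
    "SAD": "Software Architecture Description",
    "SIDD": "Software Interface Design Description",
    "TVPL": "Test or Validation Plan",
}

# (p) = preliminary, (d) = detailed, (u) = updated; these qualifiers are
# stripped before the legend lookup.
MATRIX = {
    "System Requirements Analysis": ["SRS"],
    "System Architectural Design": ["SARAD"],
    "Software Requirements Analysis": ["SRD", "UDD"],
    "Software Architectural Design": ["DDD (p)", "SAD", "SIDD (p)", "TVPL", "UDD (u)"],
}

def undefined_acronyms(matrix, legend):
    """Return (phase, acronym) pairs that appear in the matrix but not the legend."""
    missing = []
    for phase, products in matrix.items():
        for product in products:
            acronym = product.split()[0]  # drop the (p)/(d)/(u) qualifier
            if acronym not in legend:
                missing.append((phase, acronym))
    return missing

print(undefined_acronyms(MATRIX, LEGEND))  # → []
```

Extending the same structure to the record and review columns would let SQA or SCM tooling flag undefined acronyms before the plan is baselined.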

• System Architectural Design Phase: System architectural design is the process of transforming the system-level requirements into an architectural design, for a system or segment of a system, including its operational and support environments, for later use by software requirements analysis.

• Software Requirements Analysis Phase: Software requirements analysis is the process of developing software requirements, for a CSCI of a system or segment of a system, for later use by software architectural design.

• Software Architectural Design Phase: Software architectural design is the process of transforming software requirements into a top-level software design consisting of computer software components (CSC), for a CSCI of a system or segment of a system, for later use by software detailed design.


• Software Detailed Design Phase: Software detailed design is the process of decomposing the software architectural design into an increasingly detailed hierarchy of computer software units (CSU), for a CSCI of a system or segment of a system, for later use by software coding and unit testing.

• Software Coding and Testing Phase: Software coding and testing is the process of transforming the software detailed design (the CSUs) into computer software, for a CSCI of a system or segment of a system, for later use by software integration.

• Software Integration Phase: Software integration is the process of combining and evaluating the CSUs that have been implemented and unit tested, for a CSCI of a system or segment of a system, for later use by software qualification testing.

• Software Qualification Testing Phase: Software qualification testing is the process of dynamically evaluating computer software using test cases and test procedures based on CSCI-level software requirements, for CSCIs of a system or segment of a system, for later use by system integration.

• System Integration Phase: System integration is the process of combining and evaluating CSCIs and HWCIs of a system or segment of a system, that have undergone individual software and hardware qualification testing, for later use by system qualification testing.

• System Qualification Testing Phase: System qualification testing is the process of dynamically evaluating integrated CSCIs and HWCIs of a system or segment of a system, using test cases and test procedures based on system-level requirements, for later use by software installation.

• Software Installation Phase: Software installation is the process of transporting and installing software associated with a system or a segment of a system from the development environment to the target environment, using installation policies, plans, procedures, and work instructions, for later use by software acceptance support.
• Software Acceptance Support Phase: Software acceptance support is the process of assisting customers and end users in dynamically evaluating a system or segment of a system, using acceptance test plans, test cases, and test procedures, in order to determine whether to accept the system from the developer.

3.2.2 SQA Activities

The SQA activities principally consist of auditing the software activities, software products, technical reviews, and software records of the software life cycle phases for conformance to software process and software product standards. There are SQA activities for each of the twelve software life cycle phases: system requirements analysis, system architectural design, software requirements analysis, software architectural design, software detailed design, software coding and testing, software integration, software qualification testing, system integration, system qualification testing, software installation, and software acceptance support.
• System Requirements Analysis Phase: The SQA activities for the system requirements analysis phase include auditing the system requirements analysis activities, the SRS, walkthroughs of the SRS, and inspections of the SRS for conformance to the system requirements analysis activity standard, the SRS document standard, the walkthrough standard, and the inspection standard.
• System Architectural Design Phase: The SQA activities for the system architectural design phase include auditing the system architectural design activities, the SARAD, walkthroughs of the SARAD, and inspections of the SARAD for conformance to the system architectural design activity standard, the SARAD document standard, the walkthrough standard, and the inspection standard.
• Software Requirements Analysis Phase: The SQA activities for the software requirements analysis phase include auditing the software requirements analysis activities, the SRD and UDD, walkthroughs of the SRD and UDD, and inspections of the SRD and UDD for conformance to the software requirements analysis activity standard, the SRD and UDD document standards, the walkthrough standard, and the inspection standard.
• Software Architectural Design Phase: The SQA activities for the software architectural design phase include auditing the software architectural design activities, the DDD (p), SAD, SIDD (p), TVPL, and UDD (u), and walkthroughs and inspections of those work products for conformance to the software architectural design activity standard, the corresponding document standards, the walkthrough standard, and the inspection standard.
• Software Detailed Design Phase: The SQA activities for the software detailed design phase include auditing the software detailed design activities, the DDD (d), SDD, SIDD (d), TVPL (u), and UDD (u), and walkthroughs and inspections of those work products for conformance to the software detailed design activity standard, the corresponding document standards, the walkthrough standard, and the inspection standard.
• Software Coding and Testing Phase: The SQA activities for the software coding and testing phase include auditing the software coding and testing activities, the DDD (u), TVPL (u), TVPR, UDD (u), and TVRR, and walkthroughs and inspections of those work products for conformance to the software coding and testing activity standard, the corresponding document standards, the walkthrough standard, and the inspection standard.
• Software Integration Phase: The SQA activities for the software integration phase include auditing the software integration activities, the SOIP, TVPR (u), UDD (u), and TVRR, and walkthroughs and inspections of those work products for conformance to the software integration activity standard, the corresponding document standards, the walkthrough standard, and the inspection standard.
• Software Qualification Testing Phase: The SQA activities for the software qualification testing phase include auditing the software qualification testing activities, the UDD (u), SIAR, and TVRR, and walkthroughs and inspections of those work products for conformance to the software qualification testing activity standard, the corresponding document standards, the walkthrough standard, and the inspection standard.
• System Integration Phase: The SQA activities for the system integration phase include auditing the system integration activities, the TVPR (u) and TVRR, and walkthroughs and inspections of those work products for conformance to the system integration activity standard, the TVPR (u) and TVRR document standards, the walkthrough standard, and the inspection standard.
• System Qualification Testing Phase: The SQA activities for the system qualification testing phase include auditing the system qualification testing activities, the TVRR, walkthroughs of the TVRR, and inspections of the TVRR for conformance to the system qualification testing activity standard, the TVRR document standard, the walkthrough standard, and the inspection standard.
• Software Installation Phase: The SQA activities for the software installation phase include auditing the software installation activities, the SIP, walkthroughs of the SIP, and inspections of the SIP for conformance to the software installation activity standard, the SIP document standard, the walkthrough standard, and the inspection standard.
• Software Acceptance Support Phase: The SQA activities for the software acceptance support phase include auditing the software acceptance support activities, the TVRR, walkthroughs of the TVRR, and inspections of the TVRR for conformance to the software acceptance support activity standard, the TVRR document standard, the walkthrough standard, and the inspection standard.

3.2.3 Milestones

The milestones which follow the SQA activities include the system/subsystem requirements review, system/subsystem design review, software requirements review, software preliminary design review, software critical design review, software test readiness review, software test results review, system test readiness review, system test results review, software usability review, and software maintenance review.
• System/Subsystem Requirements Review (SSRR): External review techniques include a system/subsystem requirements review (SSRR), which immediately follows the system requirements analysis phase. SQA audits of system requirements analysis activities, the SRS, SRS walkthroughs, and SRS inspections shall occur before SSRR commences.


• System/Subsystem Design Review (SSDR): External review techniques include a system/subsystem design review (SSDR), which is necessary to successfully conclude the system architectural design phase. SQA audits of system architectural design activities, the SARAD, SARAD walkthroughs, and SARAD inspections shall occur before SSDR commences.
• Software Requirements Review (SRR): External review techniques include a software requirements review (SRR), which immediately follows the software requirements analysis phase. SQA audits of software requirements analysis activities, the SRD and UDD, SRD and UDD walkthroughs, and SRD and UDD inspections shall occur before SRR commences.
• Software Preliminary Design Review (SPDR): External review techniques include a software preliminary design review (SPDR), which immediately follows the software architectural design phase. SQA audits of software architectural design activities, the DDD (p), SAD, SIDD (p), TVPL, and UDD (u), and walkthroughs and inspections of those work products shall occur before SPDR commences.
• Software Critical Design Review (SCDR): External review techniques include a software critical design review (SCDR), which immediately follows the software detailed design phase. SQA audits of software detailed design activities, the DDD (d), SDD, SIDD (d), TVPL (u), and UDD (u), and walkthroughs and inspections of those work products shall occur before SCDR commences.
• Software Test Readiness Review (SOTRR): External review techniques include a software test readiness review (SOTRR), which immediately follows the software integration phase. SQA audits of software coding and testing activities and software integration activities, the DDD (u), TVPL (u), TVPR, UDD (u), TVRR, SOIP, and TVPR (u), and walkthroughs and inspections of those work products shall occur before SOTRR commences.
• Software Test Results Review (SOTRER): External review techniques include a software test results review (SOTRER), which immediately follows the software qualification testing phase. SQA audits of software qualification testing activities, the UDD (u), SIAR, and TVRR, and walkthroughs and inspections of those work products shall occur before SOTRER commences.
• System Test Readiness Review (SYTRR): External review techniques include a system test readiness review (SYTRR), which immediately follows the system integration phase. SQA audits of system integration activities, the TVPR (u) and TVRR, and walkthroughs and inspections of those work products shall occur before SYTRR commences.


• System Test Results Review (SYTRER): External review techniques include a system test results review (SYTRER), which immediately follows the system qualification testing phase. SQA audits of the system qualification testing activities, the TVRR, TVRR walkthroughs, and TVRR inspections shall occur before SYTRER commences.
• Software Usability Review (SUR): External review techniques include a software usability review (SUR), which immediately follows the software installation phase. SQA audits of the software installation activities, the SIP, SIP walkthroughs, and SIP inspections shall occur before SUR commences.
• Software Maintenance Review (SMR): External review techniques include a software maintenance review (SMR), which immediately follows the software acceptance support phase. SQA audits of the software acceptance support activities, the TVRR, TVRR walkthroughs, and TVRR inspections shall occur before SMR commences.

3.3 Responsibilities

This paragraph shall identify the specific organizational elements responsible for each task. The responsibilities of SQA shall include auditing the software processes and software products of the software life cycle for conformance to software process and software product standards. SQA shall audit the software processes, which include the software activities themselves, walkthroughs of the software work products, and inspections of the software work products, for conformance to software activity standards, walkthrough standards, and inspection standards. SQA shall audit the software products, which include each of the 31 software work products resulting from each of the twelve software activities, for conformance to software work product standards.

3.3.1 Software Activities

The responsibilities of SQA shall include auditing the software activities for each of the twelve software life cycle phases for conformance to software activity standards. SQA shall audit the system requirements analysis, system architectural design, software requirements analysis, software architectural design, software detailed design, software coding and testing, software integration, software qualification testing, system integration, system qualification testing, software installation, and software acceptance support activities.

3.3.2 Software Work Products

The responsibilities of SQA shall include auditing the software work products for each of the twelve software life cycle phases for conformance to software work product standards. SQA shall audit the SRS resulting from the system requirements analysis activity. SQA shall audit the SARAD resulting from the system architectural design activity. SQA shall audit the SRD and UDD resulting from the software requirements analysis activity. SQA shall audit the DDD (p), SAD, SIDD (p), TVPL, and UDD (u) resulting from the software architectural design activity. SQA shall audit the DDD (d), SDD, SIDD (d), TVPL (u), and UDD (u) resulting from the software detailed design activity. SQA shall audit the DDD (u), TVPL (u), TVPR, UDD (u), and TVRR resulting from the software coding and testing activity. SQA shall audit the SOIP, TVPR (u), UDD (u), and TVRR resulting from the software integration activity. SQA shall audit the UDD (u), SIAR, and TVRR resulting from the software qualification testing activity. SQA shall audit the TVPR (u) and TVRR resulting from the system integration activity. SQA shall audit the TVRR resulting from the system qualification testing activity. SQA shall audit the SIP resulting from the software installation activity. Finally, SQA shall audit the TVRR resulting from the software acceptance support activity.

3.3.3 Walkthroughs of Software Work Products

The responsibilities of SQA shall include auditing walkthroughs of software work products for each of the twelve software life cycle phases for conformance to walkthrough standards. SQA shall audit walkthroughs of the SRS resulting from the system requirements analysis activity. SQA shall audit walkthroughs of the SARAD resulting from the system architectural design activity. SQA shall audit walkthroughs of the SRD and UDD resulting from the software requirements analysis activity. SQA shall audit walkthroughs of the DDD (p), SAD, SIDD (p), TVPL, and UDD (u) resulting from the software architectural design activity. SQA shall audit walkthroughs of the DDD (d), SDD, SIDD (d), TVPL (u), and UDD (u) resulting from the software detailed design activity. SQA shall audit walkthroughs of the DDD (u), TVPL (u), TVPR, UDD (u), and TVRR resulting from the software coding and testing activity. SQA shall audit walkthroughs of the SOIP, TVPR (u), UDD (u), and TVRR resulting from the software integration activity. SQA shall audit walkthroughs of the UDD (u), SIAR, and TVRR resulting from the software qualification testing activity. SQA shall audit walkthroughs of the TVPR (u) and TVRR resulting from the system integration activity. SQA shall audit walkthroughs of the TVRR resulting from the system qualification testing activity. SQA shall audit walkthroughs of the SIP resulting from the software installation activity. Finally, SQA shall audit walkthroughs of the TVRR resulting from the software acceptance support activity.

3.3.4 Inspections of Software Work Products

The responsibilities of SQA shall include auditing inspections of software work products for each of the twelve software life cycle phases for conformance to inspection standards. SQA shall audit inspections of the SRS resulting from the system requirements analysis activity. SQA shall audit inspections of the SARAD resulting from the system architectural design activity. SQA shall audit inspections of the SRD and UDD resulting from the software requirements analysis activity. SQA shall audit inspections of the DDD (p), SAD, SIDD (p), TVPL, and UDD (u) resulting from the software architectural design activity. SQA shall audit inspections of the DDD (d), SDD, SIDD (d), TVPL (u), and UDD (u) resulting from the software detailed design activity. SQA shall audit inspections of the DDD (u), TVPL (u), TVPR, UDD (u), and TVRR resulting from the software coding and testing activity. SQA shall audit inspections of the SOIP, TVPR (u), UDD (u), and TVRR resulting from the software integration activity. SQA shall audit inspections of the UDD (u), SIAR, and TVRR resulting from the software qualification testing activity. SQA shall audit inspections of the TVPR (u) and TVRR resulting from the system integration activity. SQA shall audit inspections of the TVRR resulting from the system qualification testing activity. SQA shall audit inspections of the SIP resulting from the software installation activity. Finally, SQA shall audit inspections of the TVRR resulting from the software acceptance support activity.

4.0 DOCUMENTATION

4.1 Purpose

This section shall perform the following functions:
(1) Identify the documentation governing the development, verification and validation, use, and maintenance of the software.
(2) State how the documents are to be checked for adequacy. This shall include the criteria and the identification of the review or audit by which the adequacy of each document shall be confirmed, with reference to Section 6 of the SQAP.

4.2 Minimum Documentation Requirements

To ensure that the implementation of the software satisfies requirements, the following documentation is required as a minimum:

4.2.1 Software Requirements Document (SRD)

The SRD shall clearly and precisely describe each of the essential requirements (functions, performances, design constraints, and attributes) of the software and the external interfaces. Each requirement shall be defined such that its achievement is capable of being objectively verified and validated by a prescribed method; for example, inspection, analysis, demonstration, or test. The purpose of the software requirements description is to specify the requirements for a software item and the methods to be used to ensure that each requirement has been met. The software requirements description is used as the basis for design and qualification testing of a software item. SQA shall conduct an audit of the SRD to verify the following properties:


• Generic description information.
• System identification and overview.
• Functionality of the software item.
• Performance requirements.
• Physical characteristics.
• Environmental conditions.
• Requirements for interfaces external to the software item.
• Qualification requirements.
• Safety specifications, including those related to methods of operation and maintenance, environmental influences, and personnel injury.
• Security and privacy specifications, including those related to compromise of sensitive information.
• Human-factors engineering (ergonomics) requirements.
• Manual operations.
• Human-equipment interactions.
• Constraints on personnel.
• Areas that need concentrated human attention and are sensitive to human errors and training.
• Data definition and database requirements, including installation-dependent data for adaptation needs.
• Installation and acceptance requirements of the delivered software product at the operation site(s).
• Installation and acceptance requirements of the delivered software product at the maintenance site(s).
• User documentation requirements.
• User operation and execution requirements.
• User maintenance requirements.
• Software quality characteristics.
• Design and implementation constraints.


• Computer resource requirements.
• Packaging requirements.
• Precedence and criticality of requirements.
• Requirements traceability.
• Rationale.

4.2.2 Software Architecture Description (SAD)

The SAD shall depict how the software will be structured to satisfy the requirements in the SRD. The SAD shall describe the components and subcomponents of the software design, including databases and internal interfaces. The SAD shall be prepared first as the Preliminary SAD (also referred to as the Top-Level SAD) and shall subsequently be expanded to produce the Detailed SDD. The purpose of the software architecture description is to describe the software item-wide design decisions and the software item architectural design. SQA shall conduct an audit of the SAD to verify the following properties:
• Generic description information.
• System overview and identification.
• Software item architectural design.
• Software architecture general description.
• Software component definition.
• Identification of software requirements allocated to each software component.
• Software component concept of execution.
• Resource limitations and the strategy for managing each resource and its limitation.
• Rationale for software architecture and component definition decisions, including database and user interface design.

4.2.3 Software Verification and Validation Plan (SVVP)

The SVVP shall identify and describe the methods (for example, inspection, analysis, demonstration, or test) to be used:
(1) To verify that (a) the requirements in the SRS have been approved by an appropriate authority; (b) the requirements in the SRS are implemented in the design expressed in the SDD; and (c) the design expressed in the SDD is implemented in the code.
(2) To validate that the code, when executed, complies with the requirements expressed in the SRS.
The purpose of the software verification and validation plan is to (1) provide, for both critical and non-critical software, uniform and minimum requirements for the format and content of SVVPs; (2) define, for critical software, specific minimum V&V tasks and their required inputs and outputs that shall be included in SVVPs; and (3) suggest optional V&V tasks to be used to tailor SVVPs as appropriate for the particular V&V effort. SQA shall conduct an audit of the SVVP to verify the following properties:
• Purpose.
• Referenced documents.
• Definitions.
• Verification and validation overview.
• Organization.
• Master schedule.
• Resources summary.
• Responsibilities.
• Tools, techniques, and methodologies.
• Life-cycle verification and validation.
• Management of V&V.
• Concept phase V&V.
• Requirements phase V&V.
• Design phase V&V.
• Implementation phase V&V.
• Test phase V&V.


• Installation and checkout phase V&V.
• Operation and maintenance phase V&V.
• Software verification and validation reporting.
• Required reports.
• Optional reports.
• Verification and validation administrative procedures.
• Anomaly reporting and resolution.
• Task iteration policy.
• Deviation policy.
• Control procedures.
• Standards, practices, and conventions.

4.2.4 Software Verification and Validation Report (SVVR)

The SVVR shall describe the results of the execution of the SVVP. The purpose of the software verification and validation report is to summarize the results of V&V tasks performed in each of the software life cycle phases: system requirements analysis, system architectural design, software requirements analysis, software architectural design, software detailed design, software coding and testing, software integration, software qualification testing, system integration, system qualification testing, software installation, and software acceptance support. SQA shall conduct an audit of the SVVR to verify the following properties:
• Task reporting.
• Interim results and status.
• V&V phase summary report.
• Description of V&V tasks performed.
• Summary of task results.
• Summary of anomalies and resolution.


• Assessment of software quality.
• Recommendations.
• Anomaly report.
• Description and location.
• Impact.
• Cause.
• Criticality.
• Recommendations.
• V&V final report.
• Summary of all life-cycle V&V tasks.
• Summary of task results.
• Summary of anomalies and resolutions.
• Assessment of overall software quality.
• Recommendations.
• Special studies report.
• Purpose and objectives.
• Approach.
• Summary.
• Other reports.
• Software quality assurance results.
• Software testing results.
• Software configuration management results.

4.2.5 User Documentation Description (UDD)

User documentation (e.g., a manual or guide) shall specify and describe the required data and control inputs, input sequences, options, program limitations, and other activities or items necessary for successful execution of the software. All error messages shall be identified and corrective actions described. A method of describing user-identified errors or problems to the developer or the owner of the software shall be described. (Embedded software that has no direct user interaction has no need for user documentation and is therefore exempted from this requirement.) The purpose of the user documentation description is to record the planning and engineering information created during the development process that is of use to the users of the software product or service. SQA shall conduct an audit of the UDD to verify the following properties:
• Scope.
• Identification.
• System overview.
• Document overview.
• Referenced documents.
• Software summary.
• Software application.
• Software inventory.
• Software environment.
• Software organization and overview of operation.
• Contingencies and alternate states and modes of operation.
• Security and privacy.
• Assistance and problem reporting.
• Access to the software.
• First-time user of the software.
• Equipment familiarization.
• Access control.
• Installation and setup.
• Initiating a session.


• Stopping and suspending work.
• Processing reference guide.
• Capabilities.
• Conventions.
• Processing procedures.
• (Aspect of software use).
• Related processing.
• Data backup.
• Recovery from errors, malfunctions, and emergencies.
• Messages.
• Quick-reference guide.
• Notes.
• Appendices.

4.2.6 Software Configuration Management Plan (SCMP)

The SCMP shall document methods to be used for identifying software items, controlling and implementing changes, and recording and reporting change implementation status. The purpose of the software configuration management plan is to (1) provide a structure for identifying and controlling software documentation, software source code, software interfaces, and databases to support all software life cycle phases; (2) support the software development and maintenance methodology that fits the software requirements, standards, policies, organization, and management philosophy; and (3) support production of management and product information concerning the status of software baselines, change control, tests, releases, and audits. SQA shall conduct an audit of the SCMP to verify the following properties:
• Introduction.
• SCM management.
• Configuration identification.
• Identifying configuration items.


• Naming configuration items.
• Acquiring configuration items.
• Configuration control.
• Requesting changes.
• Evaluating changes.
• Approving or disapproving changes.
• Implementing changes.
• Configuration status accounting.
• Configuration audits and reviews.
• Interface control.
• Subcontractor/vendor control.
• SCM schedules.
• SCM resources.
• SCM plan maintenance.

4.3 Other

Other documentation may include the following:
(1) Software Development Plan
(2) Standards and Procedures Manual
(3) Software Project Management Plan
(4) Software Maintenance Manual

4.3.1 Software Project Plan (SPP)

The purpose of the software project plan is to serve as a controlling document for managing a software project. A software project plan defines the technical and managerial project functions, activities, and tasks necessary to satisfy the requirements of a software project, as defined in the project agreement. SQA shall conduct an audit of the SPP to verify the following properties:


• Generic plan information for managing the project.
• Project organizational structure showing authority and responsibility of each organizational unit, including external organizations.
• Engineering environment (for development, operation, or maintenance, as applicable), including test environment, library, equipment, facilities, standards, procedures, and tools.
• Work breakdown structure of the life cycle processes and activities, including the software products, software services, and non-deliverable items to be performed, budgets, staffing, physical resources, software size, and schedules associated with the tasks.
• Management of the quality characteristics of the software products or services (separate plans for quality may be developed).
• Management of safety, security, privacy, and other critical requirements of the software products or services (separate plans for safety and security may be developed).
• Subcontractor management, including subcontractor selection and involvement between the subcontractor and the acquirer, if any.
• Quality assurance.
• Verification and validation, including the approach for interfacing with the verification and validation agent, if specified.
• Acquirer involvement (i.e., joint reviews, audits, informal meetings, reporting, modification and change, implementation, approval, acceptance, access to facilities).
• User involvement (i.e., requirements setting exercises, prototype demonstrations and evaluations).
• Risk management (i.e., the management of the areas of the project that involve technical, cost, and schedule risks).
• Security policy (i.e., the rules for need-to-know and access-to-information at each project organizational level).
• Approval required by such means as regulations, required certifications, proprietary, usage, ownership, warranty, and licensing rights.
• Means for scheduling, tracking, and reporting.
• Training of personnel.
• Software life cycle model.
• Configuration management (separate plans for configuration management may be developed).
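An SQA audit of the SPP against the properties above is, in practice, a conformance checklist: each required property is either addressed by the reviewed plan or recorded as a finding. As a hypothetical sketch (the abbreviated property names and the function below are illustrative assumptions, not part of this template), such a checklist could be automated as follows:

```python
# Hypothetical sketch of an SQA conformance checklist for the Software Project
# Plan (SPP). The property names are abbreviated from the list above; the
# function itself is an illustrative assumption, not a template requirement.

REQUIRED_SPP_PROPERTIES = [
    "Project organizational structure",
    "Engineering environment",
    "Work breakdown structure",
    "Quality assurance",
    "Verification and validation",
    "Risk management",
    "Configuration management",
]

def audit_spp(reviewed_sections):
    """Return the required SPP properties not addressed by the reviewed plan."""
    covered = {section.strip().lower() for section in reviewed_sections}
    return [prop for prop in REQUIRED_SPP_PROPERTIES
            if prop.lower() not in covered]

# Example: a plan that omits risk management and configuration management.
findings = audit_spp([
    "Project organizational structure",
    "Engineering environment",
    "Work breakdown structure",
    "Quality assurance",
    "Verification and validation",
])
print(findings)  # ['Risk management', 'Configuration management']
```

A non-empty result would be recorded by SQA as a nonconformance against the SPP document standard.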


4.3.2 System Requirements Specification (SRS)

The purpose of the system requirements specification is to specify the requirements for a system or subsystem and the methods to be used to ensure that each requirement has been met. The system requirements specification is used as the basis for design and qualification testing of a system or subsystem. SQA shall conduct an audit of the SRS to verify the following properties:
• Generic specification information.
• System identification and overview.
• Required states and modes.
• Requirements for the functions and performance of the system.
• Business, organizational, and user requirements.
• Safety, security, and privacy protection requirements.
• Human-factors engineering (ergonomics) requirements.
• Operations and maintenance requirements.
• System external interface requirements.
• System environmental requirements.
• Design constraints and qualification requirements.
• Computer resource requirements.
• Computer hardware requirements.
• Computer hardware resource requirements, including utilization requirements.
• Computer software requirements.
• Computer communications requirements.
• System quality characteristics.
• Internal data requirements.
• Installation-dependent data requirements.
• Physical requirements.
• Personnel, training, and logistics requirements.

• Packaging requirements.
• Precedence and criticality of requirements.
• Rationale.

4.3.3 System Architecture and Requirements Allocation Description (SARAD)

The purpose of the system architecture and requirements allocation description is to describe the architectural design of a system or subsystem. SQA shall conduct an audit of the SARAD to verify the following properties:
• Generic description information.
• System overview and identification.
• Hardware item identification.
• Software item identification.
• Manual operations identification.
• Concept of execution.
• Rationale for allocation of hardware items, software items, and manual operations.

4.3.4 Database Design Description (DDD)

The purpose of the database design description is to describe the design of a database, that is, a collection of related data stored in one or more computerized files in a manner that can be accessed by users or computer programs via a database management system. The database design description may also describe the software units used to access or manipulate the data. The database design description is used as the basis for implementing the database and related software units. SQA shall conduct an audit of the DDD to verify the following properties:
• Generic description information.
• Database overview and identification.
• Design of the database, including descriptions of applicable design levels (e.g., conceptual, internal, logical, physical).
• Reference to design description of software used for database access or manipulation.


• Rationale for database design.

4.3.5 Software Interface Design Description (SIDD)

The purpose of the software interface design description is to describe the interface characteristics of one or more systems, subsystems, hardware items, software items, manual operations, or other system components. The software interface design description may describe any number of interfaces. SQA shall conduct an audit of the SIDD to verify the following properties:
• Generic description information.
• External interface identification.
• Software component identification.
• Software unit identification.
• External-software item interface definition (e.g., source language, diagrams).
• Software item-software item interface definition (e.g., source language, diagrams).
• Software component-software component interface definition (e.g., source language, diagrams).

4.3.6 Test or Validation Plan (TVPL)

The purpose of the test or validation plan is to describe plans for qualification testing of software items and software systems. The test or validation plan describes the software test environment to be used for the testing, identifies the tests to be performed, and provides schedules for test activities. SQA shall conduct an audit of the TVPL to verify the following properties:
• Generic plan information.
• Test levels.
• Test classes.
• General test conditions.
• Test progression.
• Data recording, reduction, and analysis.
• Test coverage (breadth and depth) or other methods for assuring sufficiency of testing.
• Planned tests, including items and their identifiers.
• Test schedules.
• Requirements traceability.
• Qualification testing environment, site, personnel, and participating organizations.

4.3.7 Software Design Description (SDD)

The purpose of the software design description is to describe the design of a software item. The software design description and the software architecture provide the detailed design needed to implement the software. The software design description may be supplemented by software item interface design and database design. SQA shall conduct an audit of the SDD to verify the following properties:
• Generic description information.
• Description of how the software item satisfies the software requirements, including algorithms and data structures.
• Software item input/output description.
• Static relationships of software units.
• Concept of execution, including data flow and control flow.
• Requirements traceability.
• Software component-level requirements traceability.
• Software unit-level requirements traceability.
• Rationale for software item design.
• Reuse element identification.

4.3.8 Test or Validation Procedures (TVPR)

The purpose of the test or validation procedures is to describe the test preparations, test cases, and test procedures to be used to perform qualification testing of a software item or a software system or subsystem. The test or validation procedures enable the acquirer to assess the adequacy of the qualification testing to be performed.
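The per-test information a TVPR must capture (and that the audit below checks for) can be pictured as a simple record. The following Python sketch is illustrative only; the class name, field names, and audit helper are assumptions of this sketch, not part of any standard:

```python
from dataclasses import dataclass


@dataclass
class TestProcedure:
    """Illustrative record of one TVPR test (field names are assumptions)."""
    test_id: str                    # test identifier
    author: str                     # identification of test author
    requirements_addressed: list    # requirements traceability
    prerequisite_conditions: list   # conditions that must hold before the test
    test_inputs: list               # test input
    expected_results: list          # expected test results
    evaluation_criteria: str        # criteria for evaluating results
    instructions: list              # instructions for conducting the procedure


def audit_procedure(proc: TestProcedure) -> list:
    """Return audit findings: fields an SQA reviewer would flag as empty."""
    findings = []
    for name, value in vars(proc).items():
        if not value:
            findings.append(f"missing: {name}")
    return findings


proc = TestProcedure(
    test_id="TP-001",
    author="",  # deliberately left blank to show a finding
    requirements_addressed=["SRS-4.2.1"],
    prerequisite_conditions=["database loaded"],
    test_inputs=["login request"],
    expected_results=["session created"],
    evaluation_criteria="exact match of response code",
    instructions=["run client", "submit request", "record response"],
)
print(audit_procedure(proc))  # ['missing: author']
```

A real TVPR audit checks far more than field presence, but a completeness pass of this kind is a common first step.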


SQA shall conduct an audit of the TVPR to verify the following properties:
• Generic procedure information.
• Identification of test author.
• Identification of test configuration.
• Test objectives, requirements, and rationale.
• Test preparations (hardware, software, other) for each test.
• Test descriptions:
  • Test identifier.
  • Requirements addressed.
  • Prerequisite conditions.
  • Test input.
  • Expected test results.
  • Criteria for evaluating results.
  • Instructions for conducting procedure.
• Requirements traceability.
• Rationale for decisions.

4.3.9 Test or Validation Results Report (TVRR)

The purpose of the test or validation results report is to provide a record of the qualification testing performed on a software item, a software system or subsystem, or other software-related item. The test or validation results report enables the acquirer to assess the testing and its results. SQA shall conduct an audit of the TVRR to verify the following properties:
• Generic report information.
• System identification and overview.
• Overview of test results.
• Overall assessment of the software tested.
• Impact of test environment.
• Detailed test results:
  • Test identifier.
  • Test summary.
  • Problems encountered.
  • Deviations from test cases/procedures.
• Test log.
• Rationale for decisions.

4.3.10 Software Integration Plan (SOIP)

The purpose of the software integration plan is to define the activities necessary to integrate the software units and software components into the software item. SQA shall conduct an audit of the SOIP to verify the following properties:
• Generic plan information.
• Test requirements.
• Test procedures.
• Test data.
• Test responsibilities.
• Test schedule.

4.3.11 Software Integration Audit Report (SIAR)

The purpose of the software integration audit report is to describe the results of an independent audit of software qualification testing activities and work products. SQA shall conduct an audit of the SIAR to verify the following properties:
• Date of issue and status.
• Scope.
• Issuing organization.


• References.
• Summary.
• Introduction.
• Context.
• Message.
• Contributors.
• Body.
• Conclusions and recommendations.
• Bibliography.
• Glossary.
• Change history.

4.3.12 Software Installation Plan (SIP)

The purpose of the software installation plan is to describe the information necessary to install a system or component, set initial parameters, and prepare the system or component for operational use. SQA shall conduct an audit of the SIP to verify the following properties:
• Scope:
  • Identification.
  • System overview.
  • Document overview.
  • Relationship to other plans.
• Referenced documents.
• Installation overview:
  • Description.
  • Contact point.
  • Support materials.
  • Training.
  • Tasks.
  • Personnel.
  • Security and privacy protection.
• Site-specific information for software center operations staff:
  • (Site name).
  • Schedule.
  • Software inventory.
  • Facilities.
  • Installation team.
  • Installation procedures.
  • Data update procedures.
• Site-specific information for software users:
  • (Site name).
  • Schedule.
  • Installation procedures.
  • Data update procedures.

5.0 STANDARDS, PRACTICES, CONVENTIONS, AND METRICS

5.1 Purpose

This section shall:
(1) Identify the standards, practices, conventions, and metrics to be applied.
(2) State how compliance with these items is to be monitored and assured.

5.2 Content

The subjects covered shall include the basic technical, design, and programming activities involved, such as documentation, variable and module naming, programming, inspection, and testing. As a minimum, the following information shall be provided:
(1) Documentation standards
(2) Logic structure standards
(3) Coding standards
(4) Commentary standards
(5) Testing standards and practices
(6) Selected software quality assurance product and process metrics, such as:
  (a) Branch metric
  (b) Decision point metric
  (c) Domain metric
  (d) Error message metric
  (e) Requirements demonstration metric

5.2.1 Documentation Standards

The documentation standards that shall be enforced by the SQAP are the IEEE Standard for Software Life Cycle Processes, the IEEE Standard for Software Verification and Validation Plans, the MIL-STD-498 Software User Manual Data Item Description, the IEEE Standard for Software Configuration Management Plans, and the IEEE Standard for Software Project Management Plans. Only the following documentation standards from the IEEE Standard for Software Life Cycle Processes shall be enforced by the SQAP: the SRS, SARAD, SRD, UDD, DDD, SAD, SIDD, TVPL, SDD, TVPR, TVRR, SOIP, SIAR, and SIP.

5.2.2 Logic Structure Standards

The logic structure standard that shall be enforced by the SQAP is the OMG Unified Modeling Language. The following nine UML logic structure diagrams shall be enforced by the SQAP: class diagram, object diagram, use case diagram, sequence diagram, collaboration diagram, statechart diagram, activity diagram, component diagram, and deployment diagram.
• Class Diagram: A class diagram is a graph of classifier elements connected by their various static relationships. Note that a “class” diagram may also contain interfaces, packages, relationships, and even instances, such as objects and links. Perhaps a better name would be “static structural diagram,” but “class diagram” is shorter and well established. A class diagram is a graphic view of the static structural model. The individual class diagrams do not represent divisions in the underlying model. A class diagram is a collection of (static) declarative model elements, such as classes, interfaces, and their relationships, connected as a graph to each other and to their contents. Class diagrams may be organized into packages either with their underlying models or as separate packages that build upon the underlying model packages.
• Object Diagram: An object diagram is a graph of instances, including objects and data values. A static object diagram is an instance of a class diagram; it shows a snapshot of the detailed state of a system at a point in time. The use of object diagrams is fairly limited, mainly to show examples of data structures. Tools need not support a separate format for object diagrams. Class diagrams can contain objects, so a class diagram with objects and no classes is an “object diagram.” The phrase is useful, however, to characterize a particular usage achievable in various ways. A class represents a concept within the system being modeled. Classes have data structure and behavior and relationships to other elements. The name of a class has scope within the package in which it is declared, and the name must be unique (among class names) within its package. A class is drawn as a solid-outline rectangle with three compartments separated by horizontal lines. The top compartment holds the class name; the middle compartment holds a list of attributes; the bottom compartment holds a list of operations.
• Use Case Diagram: A use case diagram shows the relationship among actors and use cases within a system. Use case diagrams show actors and use cases together with their relationships. The use cases represent functionality of a system or a classifier, like a subsystem or a class, as manifested to external interactors with the system or the classifier. A use case diagram is a graph of actors, a set of use cases, possibly some interfaces, and the relationships between these elements. The relationships are associations between the actors and the use cases, generalizations between the actors, and generalizations, extends, and includes among the use cases. The use cases may optionally be enclosed by a rectangle that represents the boundary of the containing system or classifier.
A use case is a kind of classifier representing a coherent unit of functionality provided by a system, a subsystem, or a class, as manifested by sequences of messages exchanged among the system and one or more outside interactors (called actors), together with actions performed by the system. A use case is shown as an ellipse containing the name of the use case. An optional stereotype keyword may be placed above the name, and a list of properties may be included below the name. As a classifier, a use case may also have compartments displaying attributes and operations.
• Sequence Diagram: A sequence diagram presents an interaction, which is a set of messages between classifier roles within a collaboration to effect a desired operation or result. A sequence diagram has two dimensions: 1) the vertical dimension represents time, and 2) the horizontal dimension represents different objects. Normally time proceeds down the page. (The dimensions may be reversed, if desired.) Usually only time sequences are important, but in real-time applications the time axis could be an actual metric. There is no significance to the horizontal ordering of the objects. Objects can be grouped into “swimlanes” on a diagram. (See subsequent sections for details of the contents of a sequence diagram.)
• Collaboration Diagram: A collaboration diagram presents a collaboration, which contains a set of roles to be played by objects, as well as their required relationships given in a particular context. The diagram may also present an interaction, which defines a set of messages specifying the interaction between the objects playing the roles within a collaboration to achieve the desired result. A collaboration is used for describing the realization of an operation or a classifier. A collaboration which describes a classifier, like a use case, references classifiers and associations in general, while a collaboration describing an operation includes the arguments and local variables of the operation, as well as ordinary associations attached to the classifier owning the operation. A collaboration diagram shows a graph of either objects linked to each other, or classifier roles and association roles; it may also include the communication stated by an interaction. A collaboration diagram can be given in two different forms: at instance level or at specification level; it may either show instances, links, and stimuli, or show classifier roles, association roles, and messages.
• Statechart Diagram: A statechart diagram can be used to describe the behavior of a model element such as an object or an interaction. Specifically, it describes possible sequences of states and actions through which the element can proceed during its lifetime as a result of reacting to discrete events (e.g., signals, operation invocations). Statechart diagrams represent the behavior of entities capable of dynamic behavior by specifying their response to the receipt of event instances. Typically, a statechart diagram is used for describing the behavior of classes, but statecharts may also describe the behavior of other model entities such as use cases, actors, subsystems, operations, or methods. A statechart diagram is a graph that represents a state machine. States and various other types of vertices (pseudostates) in the state machine graph are rendered by appropriate state and pseudostate symbols, while transitions are generally rendered by directed arcs that interconnect them. States may also contain sub-diagrams by physical containment or tiling.
Note that every state machine has a top state which contains all the other elements of the entire state machine. The graphical rendering of this top state is optional.
• Activity Diagram: An activity graph is a variation of a state machine in which the states represent the performance of actions or subactivities and the transitions are triggered by the completion of the actions or subactivities. It represents a state machine of a procedure itself. An activity diagram is a special case of a state diagram in which all (or at least most) of the states are action or subactivity states and in which all (or at least most) of the transitions are triggered by completion of the actions or subactivities in the source states. The entire activity diagram is attached (through the model) to a class, such as a use case, or to a package, or to the implementation of an operation. The purpose of this diagram is to focus on flows driven by internal processing (as opposed to external events). Use activity diagrams in situations where all or most of the events represent the completion of internally-generated actions (that is, procedural flow of control). Use ordinary state diagrams in situations where asynchronous events occur.
• Component Diagram: A component diagram shows the dependencies among software components, including source code components, binary code components, and executable components. For a business, “software” components are taken in the broad sense to include business procedures and documents. A software module may be represented as a component stereotype. Some components exist at compile time, some exist at link time, some exist at run time, and some exist at more than one time. A compile-only component is one that is only meaningful at compile time. The run-time component in this case would be an executable program. A component diagram has only a type form, not an instance form. To show component instances, use a deployment diagram (possibly a degenerate one without nodes). A component diagram is a graph of components connected by dependency relationships. Components may also be connected to components by physical containment representing composition relationships. A diagram containing component types and node types may be used to show static dependencies, such as compiler dependencies between programs, which are shown as dashed arrows (dependencies) from a client component to a supplier component that it depends on in some way. The kinds of dependencies are implementation-specific and may be shown as stereotypes of the dependencies. As a classifier, a component may have operations and may realize interfaces. The diagram may show these interfaces and calling dependencies among components, using dashed arrows from components to interfaces on other components.
• Deployment Diagram: Deployment diagrams show the configuration of run-time processing elements and the software components, processes, and objects that live on them. Software component instances represent run-time manifestations of code units. Components that do not exist as run-time entities (because they have been compiled away) do not appear on these diagrams; they should be shown on component diagrams. For business modeling, the run-time processing elements include workers and organizational units, and the software components include procedures and documents used by the workers and organizational units. A deployment diagram is a graph of nodes connected by communication associations. Nodes may contain component instances. This indicates that the component lives or runs on the node.
Components may contain objects; this indicates that the object resides on the component. Components are connected to other components by dashed-arrow dependencies (possibly through interfaces). This indicates that one component uses the services of another component. A stereotype may be used to indicate the precise dependency, if needed. The deployment type diagram may also be used to show which components may reside on which nodes, by using dashed arrows with the stereotype “support” from the component symbol to the node symbol or by graphically nesting the component symbol within the node symbol.

5.2.3 Coding and Commentary Standards

The coding standards that shall be enforced by the SQAP include the SPC Ada 95 Quality and Style, the Indian Hill C Style and Coding Standards, the Wildfire C++ Programming Style, the Visual Basic Style Guide, the W3C Style Guide for Online Hypertext, and the Sun Code Conventions for the Java Programming Language.
• SPC Ada 95 Quality and Style: The SPC Ada 95 Quality and Style includes requirements for source code presentation, readability, program structure, programming practices, concurrency, portability, reusability, object-oriented features, and improving performance. Source code presentation includes code formatting. Readability includes spelling, naming conventions, comments, and using types. Program structure includes high-level structure, visibility, and exceptions. Programming practices include optional parts of the syntax, parameter lists, types, data structures, expressions, statements, visibility, using exceptions, and erroneous execution and bounded errors. Concurrency includes concurrency options, communication, and termination. Portability includes fundamentals, numeric types and expressions, storage control, tasking, exceptions, representation clauses and implementation-dependent features, and input/output. Reusability includes understanding and clarity, robustness, adaptability, and independence. Object-oriented features include object-oriented design, tagged type hierarchies, tagged type operations, managing visibility, and multiple inheritance. Improving performance includes performance issues, performance measurement, program structure, data structures, algorithms, types, and pragmas.
• Indian Hill C Style and Coding Standards: The Indian Hill C Style and Coding Standards include requirements for file organization, comments, declarations, function declarations, whitespace, simple statements, compound statements, operators, naming conventions, constants, macros, conditional compilation, debugging, portability, ANSI C, special considerations, lint, make, and project-dependent standards.
• Wildfire C++ Programming Style: The Wildfire C++ Programming Style includes requirements for files, preprocessor, identifier naming conventions, using white space, types, variables, functions, statements, miscellaneous, and interaction with C. Files include file naming conventions, file organization, header file content, and source file content. Preprocessor includes macros and conditional compilation. Identifier naming conventions include general rules, identifier style, namespace clashes, and reserved namespaces.
Using white space includes indentation, long lines, comments, block comments, single-line comments, and trailing comments. Types include constants, use of const, struct and union declarations, enum declarations, classes, class declarations, class constructors and destructors, automatically-provided member functions, function overloading, operator overloading, protected items, friends, friend classes, friend methods, and templates. Variables include placement of declarations, extern declarations, indentation of variables, number of variables per line, definitions hiding other definitions, and initialized variables. Functions include function declarations and function definitions. Statements include compound statements, if/else statements, for statements, do statements, while statements, infinite loops, empty loops, switch statements, goto statements, return statements, and try/catch statements. Miscellaneous includes general comments and rules, limits on numeric precision, comparing against zero, boolean, character, integral, floating point, pointer, use and misuse of inline, references versus pointers, and portability. Interaction with C includes ANSI-C/C++ include files, including C++ header files in C programs, including C header files in C++, and C code calling C++ libraries.
• Visual Basic Style Guide: The Visual Basic Style Guide includes requirements for declaration standards, keyword reference, control and user interface standards, and database standards. Declaration standards include nomenclature standards, nomenclature for variables, nomenclature for constants, nomenclature for user-defined types, nomenclature for enumerated data types, nomenclature for line labels, nomenclature for procedures, nomenclature for declares, nomenclature for user interface elements, nomenclature exceptions, instantiation standards, instantiation of variables, instantiation of constants, instantiation of user-defined types, instantiation of enumerated data types, instantiation of line labels, instantiation of procedures, instantiation of declares, declaration modifiers, global options, compiler directives, Visual Basic limitations on declaration, and data typing of literals. Keyword reference includes compiler directives, conversion functions, date and time features, declaration features, error handling and debugging features, file system features, financial features, flow control features, math features, miscellaneous features, operators, and string features. Control and user interface standards include general considerations, communication, control interaction, documentation, and specific control information. Database standards include database design, nomenclature, normalization, database documentation, database usage, spreadsheet presentation, bound field presentation, and form object presentation.
• W3C Style Guide for Online Hypertext: The W3C Style Guide for Online Hypertext includes requirements for markup tags, character formatting, linking, inline images, tables, and fill-out forms. Markup tags include html, head, title, body, headings, paragraphs, lists, preformatted text, extended quotations, addresses, forced line breaks/postal addresses, and horizontal rules. Character formatting includes logical versus physical styles and escape sequences. Linking includes relative pathnames versus absolute pathnames, URLs, links to specific sections, and mailto. Inline images include image size attributes, aligning images, alternate text for images, background graphics, background color, and external images, sounds, and animations.
Tables include table tags, general table format, and tables for nontabular information.
• Sun Code Conventions for the Java Programming Language: The Sun Code Conventions for the Java Programming Language include requirements for file names, file organization, indentation, comments, declarations, statements, white space, naming conventions, and programming practices. File names include file suffixes and common file names. File organization includes Java source files, beginning comments, package and import statements, and class and interface declarations. Indentation includes line length and wrapping lines. Comments include implementation comment formats, block comments, single-line comments, trailing comments, end-of-line comments, and documentation comments. Declarations include number per line, initialization, placement, and class and interface declarations. Statements include simple statements, compound statements, return statements, if, if-else, if else-if else statements, for statements, while statements, do-while statements, switch statements, and try-catch statements. White space includes blank lines and blank spaces. Programming practices include providing access to instance and class variables, referring to class variables and methods, constants, variable assignments, miscellaneous practices, parentheses, returning values, expressions before ‘?’ in the conditional operator, and special comments.

5.2.4 Testing Standards and Practices


The testing standards and practices that shall be enforced by the SQAP are from the IEEE Standard for Software Life Cycle Processes. The following software activity standards from the IEEE Standard for Software Life Cycle Processes shall be enforced by the SQAP: software coding and testing, software integration, software qualification testing, system integration, system qualification testing, and software acceptance support.
• Software Coding and Testing Phase: Software coding and testing is the process of transforming the software detailed design—CSUs—into computer software, for a CSCI of a system or segment of a system, for later use by software integration.
• Software Integration Phase: Software integration is the process of combining and evaluating the CSUs that have been implemented and unit tested, for a CSCI of a system or segment of a system, for later use by software qualification testing.
• Software Qualification Testing Phase: Software qualification testing is the process of dynamically evaluating computer software using test cases and test procedures based on CSCI-level software requirements, for CSCIs of a system or segment of a system, for later use by system integration.
• System Integration Phase: System integration is the process of combining and evaluating CSCIs and HWCIs of a system or segment of a system that have undergone individual software and hardware qualification testing, for later use by system qualification testing.
• System Qualification Testing Phase: System qualification testing is the process of dynamically evaluating integrated CSCIs and HWCIs of a system or segment of a system, using test cases and test procedures based on system-level requirements, for later use by software installation.
• Software Acceptance Support Phase: Software acceptance support is the process of assisting customers and end-users in dynamically evaluating a system or segment of a system, using acceptance test plans, test cases, and test procedures, in order to determine whether or not to accept the system from the developer.
The following documentation standards from the IEEE Standard for Software Life Cycle Processes shall be enforced by the SQAP: the TVPL, TVPR, and the TVRR.
• Test or Validation Plan (TVPL): The purpose of the test or validation plan is to describe plans for qualification testing of software items and software systems. The test or validation plan describes the software test environment to be used for the testing, identifies the tests to be performed, and provides schedules for test activities.
• Test or Validation Procedures (TVPR): The purpose of the test or validation procedures is to describe the test preparations, test cases, and test procedures to be used to perform qualification testing of a software item or a software system or subsystem. The test or validation procedures enable the acquirer to assess the adequacy of the qualification testing to be performed.
• Test or Validation Results Report (TVRR): The purpose of the test or validation results report is to provide a record of the qualification testing performed on a software item, a software system or subsystem, or other software-related item. The test or validation results report enables the acquirer to assess the testing and its results.

5.2.5 Software Process and Product Metrics

The software process and product metrics that shall be enforced by the SQAP are defined by the PSM Practical Software and Systems Measurement guide. Only six software process and product metrics have been selected from the PSM Practical Software and Systems Measurement guide: software size (process), software effort (process), software cost (process), software productivity (process), software cycle time (process), and software quality (product).
• Software Size (process): Physical size and stability measures quantify the physical size of a system or product. Size is a critical factor for estimating development schedules and costs. These measures also provide information on the amount and frequency of change to products, which is especially critical late in product development. The lines of code measure counts the total amount of source code and the amount that has been added, modified, or deleted. Lines of code is a well-understood software measure that helps in estimating project cost, required effort, schedule, and productivity. Changes in the number of lines of code indicate development risk due to product size volatility, and possible additional work.
• Software Effort (process): Effort refers to development effort—the effort required to design, code, unit test, and system test, measured in person-months. The effort measure counts the number of labor hours or number of personnel applied to all tasks. This is a straightforward, easily understood measure. It can be categorized by activity as well as by product. This measure usually correlates directly with cost, but can also address other common issue areas, including schedule and progress, and process performance.
• Software Cost (process): The cost measure counts budgeted and expended costs. The measure provides information about the amount of money spent on a project or a product, compared to budgets.
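For illustration, the measures defined in this subsection reduce to straightforward arithmetic. The figures and variable names in the following Python sketch are hypothetical, not values or terms taken from the PSM guide:

```python
# Hypothetical project figures (illustrative only, not PSM-defined values).
new_loc = 12_000        # lines of code added
modified_loc = 3_000    # lines of code modified
effort_pm = 25.0        # development effort in person-months
budget_usd = 400_000.0  # budgeted cost
spent_usd = 310_000.0   # expended cost
defects_reported = 45   # defects reported against the product

total_loc = new_loc + modified_loc

# Productivity: lines of source code produced per person-month of effort.
productivity = total_loc / effort_pm

# Cost: expended cost compared against budget.
cost_remaining = budget_usd - spent_usd

# Quality (defect density): defects per thousand lines of source code (KLOC).
defect_density = defects_reported / (total_loc / 1000.0)

print(productivity)     # 600.0 (LOC per person-month)
print(cost_remaining)   # 90000.0 (dollars of budget remaining)
print(defect_density)   # 3.0 (defects per KLOC)
```

Real measurement programs track these values over time against planning estimates rather than computing them once.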
• Software Productivity (process): Productivity is the number of lines of source code produced per programmer-month (person-month) of effort. The productivity measure compares the amount of product completed to the amount of effort expended. This measure is a basic input to project planning and can evaluate whether performance levels are sufficient to meet cost and schedule estimates. Productivity is also useful early in the project for estimate and baseline comparisons before actual productivity data is available. • Software Cycle Time (process): Cycle time or duration is defined as the elapsed time in hours or months during which development effort proceeds without interruption. Cycle time
measures the length of time that it takes a process to complete all associated activities. The accumulation of all processes determines the total schedule to complete a project. Usually, a key objective in process improvement is to reduce overall cycle time.

• Software Quality (product): Quality, or defect density, is the number of software defects committed per thousand lines of software source code. The defects measure quantifies the number, status, and priority of defects reported. It provides useful information on the ability of a supplier to find and fix defects in hardware, software, or documentation. The number of defects indicates the amount of rework, and has a direct impact on quality. Arrival rates can indicate product maturity (a decrease should occur as testing is completed). Closure rates are an indication of progress, and can be used to predict test completion. Tracking the length of time that defects have remained open can be used to determine whether progress is being made in fixing defects, or whether rework is being deferred. A defect density measure, an expression of the number of defects in a quantity of product, can be derived from this measure. Defect density can identify components with the highest concentration of defects.

6.0 REVIEWS AND AUDITS

6.1 Purpose

This section shall: (1) Define the technical and managerial reviews and audits to be conducted. (2) State how the reviews and audits are to be accomplished. (3) State what further actions are required and how they are to be implemented and verified. The purpose of this section is to identify and define the technical and managerial reviews and audits that shall be enforced by the SQAP. Eighteen technical and managerial reviews and audits shall be enforced by the SQAP, as defined by the IEEE Standard for Software Quality Assurance Plans, IEEE 12207, the Military Standard for Technical Reviews and Audits for Systems, Equipments, and Computer Software, and the IEEE Standard for Software Reviews and Audits.

6.1.1 Technical and Managerial Reviews and Audits

The first ten technical and managerial reviews and audits are from the IEEE Standard for Software Quality Assurance Plans and the IEEE Standard for Software Reviews and Audits. They include the software requirements review, software preliminary design review, software critical design review, software verification and validation plan review, functional configuration audit, physical configuration audit, in-process audits, managerial reviews, software configuration management plan review, and post mortem review. The next eight reviews are from the IEEE Standard for Software Life Cycle Processes and Military Standard for Technical Reviews and
Audits for Systems, Equipments, and Computer Software. They include the system/subsystem requirements review, system/subsystem design review, software test readiness review, software test results review, system test readiness review, system test results review, software usability review, and software maintenance review.

6.1.2 Accomplishing Reviews and Audits

The reviews and audits will be accomplished through the application of individual policies and procedures for each review and audit by software project managers, software project personnel, software configuration management personnel, and software quality assurance personnel. Software project managers are responsible for executing the policies and procedures associated with joint reviews. Software configuration management is responsible for executing the policies and procedures associated with functional configuration audits and physical configuration audits. Software quality assurance is directly responsible for executing the policies and procedures of only one of the three types of in-process audits, the audit process itself. Software project personnel, namely software engineers, are responsible for executing the policies and procedures for the other two types of in-process audits, walkthroughs and inspections.

• Software Requirements Review (SRR): Software project managers and software project personnel are responsible for executing the policies and procedures of the SRR.

• Software Preliminary Design Review (SPDR): Software project managers and software project personnel are responsible for executing the policies and procedures of the SPDR.

• Software Critical Design Review (SCDR): Software project managers and software project personnel are responsible for executing the policies and procedures of the SCDR.

• Software Verification and Validation Plan Review (SVVPR): Software project managers, software project personnel, software verification and validation personnel, software quality assurance, and software configuration management are responsible for executing the policies and procedures of the SVVPR.

• Functional Configuration Audit (FCA): Software configuration management personnel are responsible for executing the policies and procedures of the FCA.
• Physical Configuration Audit (PCA): Software configuration management personnel are responsible for executing the policies and procedures of the PCA.

• In-Process Audit: Software quality assurance personnel are responsible for executing the policies and procedures of the audit process. Software project personnel, namely software engineers, are responsible for executing the policies and procedures of walkthroughs and inspections.

• Managerial Review: Software project managers and software project personnel are responsible for executing the policies and procedures of managerial reviews.

• Software Configuration Management Plan Review (SCMPR): Software project managers, software project personnel, software verification and validation personnel, software quality assurance, and software configuration management are responsible for executing the policies and procedures of the SCMPR.

• Post Mortem Review: Software project managers, software project personnel, software quality assurance, and software configuration management are responsible for executing the policies and procedures of post mortem reviews.

• System/Subsystem Requirements Review (SSRR): Software project managers and software project personnel are responsible for executing the policies and procedures of the SSRR.

• System/Subsystem Design Review (SSDR): Software project managers and software project personnel are responsible for executing the policies and procedures of the SSDR.

• Software Test Readiness Review (SOTRR): Software project managers and software project personnel are responsible for executing the policies and procedures of the SOTRR.

• Software Test Results Review (SOTRER): Software project managers and software project personnel are responsible for executing the policies and procedures of the SOTRER.

• System Test Readiness Review (SYTRR): Software project managers and software project personnel are responsible for executing the policies and procedures of the SYTRR.

• System Test Results Review (SYTRER): Software project managers and software project personnel are responsible for executing the policies and procedures of the SYTRER.

• Software Usability Review (SUR): Software project managers and software project personnel are responsible for executing the policies and procedures of the SUR.

• Software Maintenance Review (SMR): Software project managers and software project personnel are responsible for executing the policies and procedures of the SMR.

6.1.3 Implementing and Verifying Reviews and Audits

Implementation and verification of the eighteen major types of reviews and audits shall be accomplished by audits performed by software quality assurance personnel. SQA shall audit each of the eighteen types of reviews and audits using the audit process (with the exception of the audit process itself). Verification of the audit process, as well as audit process effectiveness, shall be evaluated independently of software quality assurance personnel.

6.2 Minimum Requirements

As a minimum, the following reviews and audits shall be conducted:

6.2.1 Software Requirements Review (SRR)

The SRR is held to ensure the adequacy of the requirements stated in the SRS. External review techniques include a software requirements review (SRR), which immediately follows software requirements analysis. The SRR is a review of the finalized CSCI requirements and operational concept. The SRR is conducted when CSCI requirements have been sufficiently defined to evaluate the contractor's responsiveness to and interpretation of the system, subsystem, or prime item level requirements. A successful SRR is predicated upon the contracting agency's determination that the COD, SRD, and UDD form a satisfactory basis for proceeding into software architectural design.

6.2.2 Software Preliminary Design Review (SPDR)

The SPDR (also known as top-level design review) is held to evaluate the technical adequacy of the preliminary design (also known as top-level design) of the software as depicted in the preliminary software design description. External review techniques include a software preliminary design review (SPDR), which immediately follows software architectural design. This review is conducted for each configuration item or aggregate of configuration items to evaluate the progress, technical adequacy, and risk resolution (on a technical, cost, and schedule basis) of the selected design approach. For configuration items, this review determines their compatibility with the performance and engineering specialty requirements of the HWCI development specification, evaluates the degree of definition, and assesses the technical risk associated with the selected manufacturing methods/processes. Finally, this review establishes the existence and compatibility of the physical and functional interfaces among the configuration items and other items of equipment, facilities, computer software, and personnel. For CSCIs, this review focuses on the evaluation of the progress, consistency, and technical adequacy of the selected top-level design and test approach, compatibility between software requirements and preliminary design, and on the preliminary version of the operation and support documents.

6.2.3 Software Critical Design Review (SCDR)

The SCDR (also known as detailed design review) is held to determine the acceptability of the detailed software designs as depicted in the detailed software design description in satisfying the requirements of the SRD. External review techniques include a software critical design review (SCDR), which immediately follows software detailed design. This review is conducted for each configuration
item when detailed design is essentially complete. The purpose of this review is to determine that the detailed design of the configuration item under review satisfies the performance and engineering specialty requirements of the HWCI development specifications. This review also establishes the detailed design compatibility among the configuration items and other items of equipment, facilities, computer software, and personnel, assesses configuration item risk areas (on a technical, cost, and schedule basis), and assesses the results of the producibility analyses conducted on system hardware. Finally, this review examines the preliminary hardware product specifications. For CSCIs, this review focuses on the determination of the acceptability of the detailed design, performance, and test characteristics of the design solution, and on the adequacy of the operation and support documents.

6.2.4 Software Verification and Validation Plan Review (SVVPR)

The SVVPR is held to evaluate the adequacy and completeness of the verification and validation methods defined in the SVVP. The objective of the Software Verification and Validation Plan Review (SVVPR) shall be to verify that the SVVP conforms to software V&V standards and meets the needs of the software project, to measure compliance with the SVVP, to determine the effectiveness of software V&V, and to resolve software V&V non-conformances. Verifying that the SVVP conforms to software V&V standards consists of conducting audits of the SVVP to ensure that it meets the requirements of the SVVP standard. Verifying that the SVVP meets the needs of the software project consists of conducting managerial reviews, walkthroughs, and inspections of the SVVP to ensure that it meets the requirements as stated in software project plans and software requirements documents. Measuring compliance with the SVVP consists of conducting audits of software V&V activities to determine their compliance with policies and procedures. Determining the effectiveness of software V&V consists of analyzing completion of SVVP tasks, compliance levels of software V&V activities, and software quality and reliability levels of the software work products themselves. Resolving software V&V non-conformances consists of identifying, monitoring, and tracking the issues, actions, and non-conformances arising from managerial reviews, walkthroughs, inspections, and audits, and ensuring their rapid resolution and closure.

6.2.5 Functional Configuration Audit (FCA)

This audit is held prior to the software delivery to verify that all requirements specified in the SRS have been met. The objective of the Functional Configuration Audit (FCA) shall be to verify that the configuration item's actual performance complies with its Hardware Development or Software Requirements and Interface Requirements Specifications. Test data shall be reviewed to verify that the hardware or computer software performs as required by its functional/allocated
configuration identification. For configuration items developed at Government expense, an FCA shall be a prerequisite to acceptance of the configuration item. For software, a technical understanding shall be reached on the validity and the degree of completeness of the Software Test Reports and, as appropriate, the Computer System Operator's Manual (CSOM), the Software User's Manual (SUM), and the Computer System Diagnostic Manual (CSDM).

6.2.6 Physical Configuration Audit (PCA)

This audit is held to verify that the software and its documentation are internally consistent and are ready for delivery. The Physical Configuration Audit (PCA) shall be the formal examination of the as-built version of a configuration item against its design documentation in order to establish the product baseline. After successful completion of the audit, all subsequent changes are processed by engineering change action. The PCA also determines that the acceptance testing requirements prescribed by the documentation are adequate for acceptance of production units of a configuration item by quality assurance activities. The PCA includes a detailed audit of engineering drawings, specifications, technical data, and tests utilized in production of HWCIs, and a detailed audit of design documentation, listings, and manuals for CSCIs. The review shall include an audit of the released engineering documentation and quality control records to make sure the as-built or as-coded configuration is reflected by this documentation. For software, the Software Product Specification and Software Version Description shall be a part of the PCA review.

6.2.7 In-Process Audit

In-process audits of a sample of the design are held to verify consistency of the design, including: (1) Code versus design documentation (2) Interface specifications (hardware and software) (3) Design implementations versus functional requirements (4) Functional requirements versus test descriptions. There are three types of in-process audits: software audits, walkthroughs, and inspections. Software audits are independent evaluations of software activities and software work products by software quality assurance, in order to verify conformance to software process and product standards. Walkthroughs are informal design review meetings held principally by software project managers to elicit comments and feedback on their design solutions. Inspections are expertly facilitated evaluations of software products by domain experts, namely software
engineers, to evaluate their conformance to requirements and identify software defects for mandatory correction. The three types of in-process audits are each unique, yet complementary. SQA audits verify conformance to software process and product standards; software project manager walkthroughs are open forums for evaluating software designs; and software engineering inspections are expert forums for directly improving software quality.

• Software Audit: The objective of software auditing is to provide an objective compliance confirmation of products and processes to certify adherence to standards, guidelines, specifications, and procedures. Audits are performed in accordance with documented plans and procedures. The audit plan establishes a procedure to conduct the audit and for follow-up action on the audit findings. In performing the audit, audit personnel evaluate software elements and the processes for producing them against objective audit criteria, such as contracts, requirements, plans, specifications, procedures, guidelines, and standards. The results of the audit are documented and are submitted to the management of the audited organization, to the entity initiating the audit, and to any external organizations identified in the audit plan. The report includes a list of the items in noncompliance or other issues for subsequent review and action. When stipulated by the audit plan, recommendations are reported in addition to the audit results.
• Walkthrough: A walkthrough is an informal design review meeting in which the manager, supervisor, or technical lead who is directly responsible for creating or designing a product verbalizes the intended operational flow, functional flow, and/or the rationale and justification for selecting technologies, a technical architecture, a detailed design, or a specific solution to satisfy the product’s requirements or specifications, with other managers, engineers, and technical specialists (in order to defend the design concept, solicit a critique of the approach, or solicit design alternatives). In short, walkthroughs are intended for managers to solicit design alternatives, without any mandatory action on behalf of the manager or product author.

• Inspection: An inspection is a highly structured and facilitated meeting in which independent technical experts analyze and examine each of the individual product characteristics one-by-one, in order to identify defects, non-conformances to requirements and specifications, non-conformances to standards, non-conformances to numerical tolerances, operational and functional failures, and/or safety hazards. Inspections are held without the presence of managers, supervisors, or technical leads, without any defense from the author or creator of the product, and without any consideration of design alternatives, design critiques, or subjective improvements to the product’s design by the examiners (in order to identify defects for later mandatory correction and to enable early validation of the product by internal technical experts before it is delivered). In short, inspections are for technical experts to identify defects that must be corrected, not to suggest design alternatives or subjective improvements to the product.

6.2.8 Managerial Review

Managerial reviews are held periodically to assess the execution of all of the actions and items identified in the SQAP. These reviews shall be held by an organizational element independent of the unit being reviewed, or by a qualified third party. This review may require additional changes in the SQAP itself. The objective of the management review is to provide recommendations for the following: (1) Making activities progress according to plan, based on an evaluation of product development status; (2) Changing project direction, or identifying the need for alternative planning; (3) Maintaining global control of the project through adequate allocation of resources. The management review process can be applied to new development or to maintenance activities. A management review is a formal evaluation of a project-level plan, or of project status relative to that plan, by a designated review team. During the review meeting the entire review team examines plans or progress against applicable plans, standards, and guidelines, or both. Each problem area identified by the review team is recorded. When critical data and information cannot be supplied, an additional meeting shall be scheduled to complete the management review process.

6.2.9 Software Configuration Management Plan Review (SCMPR)

The SCMPR is held to evaluate the adequacy and completeness of the configuration management methods defined in the SCMP. The objective of the Software Configuration Management Plan Review (SCMPR) shall be to verify that the SCMP conforms to SCM standards and meets the needs of the software project, to measure compliance with the SCMP, to determine the effectiveness of SCM, and to resolve SCM non-conformances. Verifying that the SCMP conforms to SCM standards consists of conducting audits of the SCMP to ensure that it meets the requirements of the SCMP standard. Verifying that the SCMP meets the needs of the software project consists of conducting managerial reviews, walkthroughs, and inspections of the SCMP to ensure that it meets the requirements as stated in software project plans and software requirements documents. Measuring compliance with the SCMP consists of conducting audits of SCM activities to determine their compliance with policies and procedures. Determining the effectiveness of SCM consists of analyzing completion of SCMP tasks, compliance levels of SCM activities, and SCM integrity levels of the software work products themselves. Resolving SCM non-conformances consists of identifying, monitoring, and tracking the issues, actions, and non-conformances arising from managerial reviews, walkthroughs, inspections, and audits, and ensuring their rapid resolution and closure.

6.2.10 Post Mortem Review

The review is held at the conclusion of the project to assess the development activities
implemented on that project and to provide recommendations for appropriate actions. The objective of the project postmortem review is to formally, objectively, and consistently evaluate the effectiveness of the software project upon its completion, in a highly structured, repeatable, and measurable fashion (in order to ensure that future projects proactively improve their performance). Evaluating the effectiveness of the software project includes evaluating the effectiveness of the software project plan; how well software project objectives were met; the initial accuracy of quantitative estimates (e.g., size, effort, cost, and critical computer resources); schedule accuracy; the appropriateness of work products, deliverables, and product quality; the appropriateness of processes, activities, and process quality; the identification and mitigation of software risks; and the allocation of personnel and facility resources (e.g., computers and software engineering tools). It also includes evaluating the effectiveness of any necessary replanning and corrective actions; software project management and coordination; intergroup coordination, communication, cooperation, and teamwork; the technical and interpersonal strengths and weaknesses of individuals, teams, and groups; corporate infrastructure support (e.g., purchasing, human resources, information systems, and facilities management); and, most importantly, the ability of the organization to effectively organize and execute similar projects in the future.

6.3 Other

Other reviews and audits may include the user documentation review (UDR). This review is held to evaluate the adequacy (e.g., completeness, clarity, correctness, and usability) of user documentation.

6.3.1 System/Subsystem Requirements Review (SSRR)

External review techniques include a system/subsystem requirements review (SSRR), which immediately follows system requirements analysis. The objective of the SSRR is to ascertain the adequacy of the contractor’s efforts in defining system requirements. It is conducted when a significant portion of the system functional requirements has been established. SSRRs are in-process reviews normally conducted during the system conceptual or validation phase. Such reviews may be conducted at any time, but normally will be conducted after accomplishment of functional analysis and preliminary requirements allocation. SSRRs determine the initial direction and progress of the systems engineering management effort and its convergence upon an optimum and complete configuration. This review will not be conducted by S&IS if a system specification is not required or, if required, is provided by the government.

6.3.2 System/Subsystem Design Review (SSDR)

External review techniques include a system/subsystem design review (SSDR), which is necessary to successfully conclude the system architectural design. This review is conducted to evaluate the optimization, correlation, completeness, and risks associated with the allocated technical requirements. Also included is a summary review of the system engineering process that produced the allocated technical requirements and of the engineering planning for the next phase of effort. Basic manufacturing considerations are reviewed, and planning for production engineering in subsequent phases is addressed. This review is conducted when the system definition effort has proceeded to the point where system characteristics are defined and the configuration items are identified.

6.3.3 Software Test Readiness Review (SOTRR)

External review techniques include a software test readiness review (SOTRR), which immediately follows software integration. This review is conducted for each CSCI to determine whether the software test procedures are complete and to assure that the contractor is prepared for formal CSCI testing. Software test procedures are evaluated for compliance with software test plans and descriptions, and for adequacy in accomplishing test requirements. At SOTRR, the contracting agency also reviews the results of informal software testing and any updates to the operation and support documents. A successful SOTRR is predicated on the contracting agency's determination that the software test procedures and informal test results form a satisfactory basis for proceeding into software qualification testing.

6.3.4 Software Test Results Review (SOTRER)

External review techniques include a software test results review (SOTRER), which immediately follows software qualification testing. SOTRERs are held to resolve open issues regarding the results of software qualification testing. The objective of the SOTRER shall be to verify that the actual performance of the configuration items of the system, as determined through testing, complies with the hardware development specification and the software requirements and interface requirements specifications, and to identify the test report(s)/data which document the results of qualification tests of the configuration items. The point of government certification will be determined by the contracting agency and will depend upon the nature of the program, risk aspects of the particular hardware and software, and contractor progress in successfully verifying the requirements of the configuration items. When feasible, the SOTRER shall be combined with the functional configuration audit at the end of configuration item/subsystem testing, prior to the physical configuration audit. If sufficient test results are not available at the functional configuration audit to ensure the configuration items will perform in their system environment, the SOTRER shall be conducted (after the physical configuration audit) during system testing, whenever the necessary tests have been successfully completed to enable certification of configuration items. For non-combined functional configuration audit/SOTRERs, traceability, correlation, and completeness of the SOTRER shall be maintained with the functional configuration audit and duplication of effort avoided.
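The quality measure in Section 5 notes that defect arrival and closure rates can be used to gauge product maturity and predict test completion, which is the kind of evidence a test results review weighs. The following is a minimal, illustrative sketch of such a summary; the data and names are hypothetical, not prescribed by any of the referenced standards:

```python
# Hypothetical sketch: tracking the open-defect backlog across test weeks,
# as input to a test results review. All counts are illustrative only.

def open_defects(arrivals, closures):
    """Running count of open defects after each reporting period,
    given per-period arrival and closure counts."""
    open_count, history = 0, []
    for arrived, closed in zip(arrivals, closures):
        open_count += arrived - closed
        history.append(open_count)
    return history

arrivals = [12, 9, 7, 4, 2]   # defects reported each test week
closures = [5, 8, 9, 6, 4]    # defects fixed and verified each test week

backlog = open_defects(arrivals, closures)
print(backlog)  # falling arrivals and a shrinking backlog suggest maturity
```

A declining arrival rate together with a shrinking backlog, as in this example, supports a determination that qualification testing is converging; a growing backlog would argue for deferring certification.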

6.3.5 System Test Readiness Review (SYTRR)

External review techniques include a system test readiness review (SYTRR), which immediately follows system integration. This review is conducted for each system to determine whether the system test procedures are complete and to assure that the contractor is prepared for formal system testing. System test procedures are evaluated for compliance with system test plans and descriptions, and for adequacy in accomplishing test requirements. At SYTRR, the contracting agency also reviews the results of informal system testing and any updates to the operation and support documents. A successful SYTRR is predicated on the contracting agency's determination that the system test procedures and informal test results form a satisfactory basis for proceeding into system qualification testing.

6.3.6 System Test Results Review (SYTRER)

External review techniques include a system test results review (SYTRER), which immediately follows system qualification testing. SYTRERs are held to resolve open issues regarding the results of system qualification testing. The objective of the SYTRER shall be to verify that the actual performance of the configuration items of the system, as determined through testing, complies with the hardware development specification and the system requirements and interface requirements specifications, and to identify the test report(s)/data which document the results of qualification tests of the configuration items. The point of government certification will be determined by the contracting agency and will depend upon the nature of the program, risk aspects of the particular hardware and software, and contractor progress in successfully verifying the requirements of the configuration items. When feasible, the SYTRER shall be combined with the functional configuration audit at the end of configuration item/subsystem testing, prior to the physical configuration audit. If sufficient test results are not available at the functional configuration audit to ensure the configuration items will perform in their system environment, the SYTRER shall be conducted (after the physical configuration audit) during system testing, whenever the necessary tests have been successfully completed to enable certification of configuration items. For non-combined functional configuration audit/SYTRERs, traceability, correlation, and completeness of the SYTRER shall be maintained with the functional configuration audit and duplication of effort avoided.

6.3.7 Software Usability Review (SUR)

External review techniques include a software usability review (SUR), which immediately follows software installation. SURs are held to resolve open issues regarding the readiness of the software for installation at user sites, status of training, including “training software products,” if applicable, the user and operator manuals, the software version descriptions, and the status of installation preparations and activities. SURs optionally involve conducting usability inspections, which are aimed at finding usability problems in an existing user interface design, and then using these problems to make recommendations for fixing the problems and improving the usability of
the design. Usability inspections consist of:

• Heuristic evaluation: having usability specialists judge whether each dialogue element conforms to established usability principles.

• Guideline reviews: checking the user interface for conformance with a comprehensive list of usability guidelines.

• Pluralistic walkthroughs: meetings where users, developers, and human factors people step through a scenario, discussing the usability issues associated with the dialogue elements involved in the scenario steps.

• Consistency inspections: having designers from multiple projects evaluate user interface consistency across a family of products.

• Standards inspections: checking the degree to which a given user interface is similar to the user interfaces of competing products in the marketplace.

• Cognitive walkthroughs: checking to see whether the user interface enables intuitive, consistent, correct, and repeatable user operations.

• Formal usability inspections: a software inspection process used to identify defects in user interfaces.

• Feature inspections: used to verify that individual user interface functions conform to system requirements.

6.3.8 Software Maintenance Review (SMR)

External review techniques include a software maintenance review (SMR), which immediately follows software acceptance support. SMRs are held to resolve open issues regarding the readiness of the software for transition to the maintenance organization, the software product specifications, the software maintenance manuals, the software version descriptions, and the status of transition preparations and activities, including transition of the software engineering environment, if applicable. SMRs are used to determine the necessary software maintenance effort, including: the age of the system since being placed in production; the number and type of changes during its life; the usefulness of the system; the types and number of change requests received; the quality and timeliness of documentation; any existing performance statistics; the number of maintainers, their job descriptions, and their actual jobs; the experience level of the maintenance staff, both industry-wide and for the particular application; the rate of turnover and possible reasons for leaving; current written maintenance methods at the system and program levels; the actual methods used by the programming staff; and the tools used to support the maintenance process and how they are used. SMRs are also used to determine the necessary software maintenance process, quantify the software maintenance effort, and develop the software maintenance plan. Finally, SMRs are used to determine the software maintenance requirements, including: expected external or regulatory changes to the system; expected internal changes to support new requirements; wish lists of new functions and features; expected upgrades for performance, adaptability, and connectivity; new lines of business that need to be supported; and new technologies that need to be incorporated.

7.0 TEST

This section shall identify all the tests not included in the SVVP for the software covered by the SQAP and shall state the methods to be used.


Software test methods that shall be enforced by the SQAP, and that are not covered by the SVVP, shall be identified and defined by the test or validation plan and the test or validation procedures. The SVVP, per se, is not the principal test plan; identification and definition of software testing methods shall therefore be provided in the test or validation plan and in the test or validation procedures.
• Test or Validation Plan (TVPL): The purpose of the test or validation plan is to describe plans for qualification testing of software items and software systems. The test or validation plan describes the software test environment to be used for the testing, identifies the tests to be performed, and provides schedules for test activities.
• Test or Validation Procedures (TVPR): The purpose of the test or validation procedures is to describe the test preparations, test cases, and test procedures to be used to perform qualification testing of a software item or a software system or subsystem. The test or validation procedures enable the acquirer to assess the adequacy of the qualification testing to be performed.

8.0 PROBLEM REPORTING AND CORRECTIVE ACTION

This section shall: (1) describe the practices and procedures to be followed for reporting, tracking, and resolving problems identified in both software items and the software development and maintenance process; and (2) state the specific organizational responsibilities concerned with their implementation. The practices and procedures to be followed for reporting, tracking, and resolving problems identified in both software items and the software development and maintenance process, and that shall be enforced by the SQAP, shall be identified and defined by the software quality assurance policy and procedure.
• Software Quality Assurance Policy and Procedure: This procedure establishes the guidelines by which software quality assurance prepares software quality assurance plans for software projects, participates in the creation of software development plans, reviews and audits the activities and work products of software projects, and handles deviations from and non-compliances with software standards, plans, and procedures by software projects. This procedure shall begin with project system managers ensuring that software quality assurance is present on all software projects and end with independent experts reviewing the methods and frequency that software quality assurance will use to provide feedback to software engineering, software configuration management, and documentation support.

9.0 TOOLS, TECHNIQUES, AND METHODOLOGIES

This section shall identify the special software tools, techniques, and methodologies that support SQA, state their purposes, and describe their use. The special software tools, techniques, and methodologies that support SQA, and that shall be enforced by the SQAP, shall include audits, walkthroughs, inspections, defect typing and classification, and software quality modeling.
• Software Audit: The objective of software auditing is to provide objective confirmation that products and processes comply with standards, guidelines, specifications, and procedures. Audits are performed in accordance with documented plans and procedures. The audit plan establishes a procedure for conducting the audit and for follow-up action on the audit findings. In performing the audit, audit personnel evaluate software elements, and the processes for producing them, against objective audit criteria such as contracts, requirements, plans, specifications, procedures, guidelines, and standards. The results of the audit are documented and submitted to the management of the audited organization, to the entity initiating the audit, and to any external organizations identified in the audit plan. The report includes a list of the items in noncompliance and other issues for subsequent review and action. When stipulated by the audit plan, recommendations are reported in addition to the audit results.
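As a rough sketch of how audit results might be recorded and the noncompliance list produced (the record fields and the example criteria below are illustrative assumptions, not part of the standard):

```python
from dataclasses import dataclass

# Hypothetical audit finding record; field names are illustrative assumptions.
@dataclass
class AuditFinding:
    item: str        # software element or process audited
    criterion: str   # objective audit criterion (contract, plan, standard, ...)
    compliant: bool  # result of evaluation against the criterion
    note: str = ""   # supporting observation

def audit_report(findings):
    """List the items in noncompliance for subsequent review and action."""
    return [f"{f.item}: noncompliant with {f.criterion} ({f.note})"
            for f in findings if not f.compliant]

findings = [
    AuditFinding("SRS section 3", "documentation standard", True),
    AuditFinding("build procedure", "SCM plan", False, "no change record"),
]
report = audit_report(findings)
```

Keeping each finding tied to the criterion it was evaluated against preserves the objectivity the audit plan requires.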
• Walkthrough: A walkthrough is an informal design review meeting in which the manager, supervisor, or technical lead who is directly responsible for creating or designing a product verbalizes the intended operational flow, functional flow, and/or the rationale and justification for selecting technologies, a technical architecture, a detailed design, or a specific solution to satisfy the product’s requirements or specifications, with other managers, engineers, and technical specialists, in order to defend the design concept, solicit a critique of the approach, or solicit design alternatives. In short, walkthroughs are intended for managers to solicit design alternatives (without any mandatory action on behalf of the manager or product author).
• Inspection: An inspection is a highly structured and facilitated meeting in which independent technical experts analyze and examine each of the individual product characteristics one-by-one, in order to identify defects, non-conformances to requirements and specifications, non-conformances to standards, non-conformances to numerical tolerances, operational and functional failures, and/or safety hazards. Inspections are held without the presence of managers, supervisors, or technical leads, without any defense from the author or creator of the product, and without any consideration of design alternatives, design critiques, or subjective improvements to the product’s design by the examiners, in order to identify defects for later mandatory correction and to enable early validation of the product by internal technical experts before it is delivered. In short, inspections are for technical experts to identify defects that must be corrected (but not to suggest design alternatives or subjective improvements to the product).
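The distinction above (inspection findings require mandatory correction, while walkthrough comments carry no mandatory action) could be modeled roughly as follows; the status values and field names are assumptions for illustration only:

```python
# Minimal sketch of an inspection defect log. Every inspection finding must be
# tracked to mandatory correction; a walkthrough comment carries no such
# obligation. Status values and field names are illustrative assumptions.

OPEN, CORRECTED = "open", "corrected"

def log_defect(log, description, category):
    # Every inspection defect starts open and must eventually be corrected.
    log.append({"description": description, "category": category,
                "status": OPEN})

def close_defect(log, index):
    log[index]["status"] = CORRECTED

def open_defects(log):
    """Defects still awaiting mandatory correction."""
    return [d for d in log if d["status"] == OPEN]

log = []
log_defect(log, "tolerance exceeded in filter routine", "numerical tolerance")
log_defect(log, "missing range check on input", "non-conformance to spec")
close_defect(log, 0)
```

The exit criterion for an inspection is then simply that the list of open defects is empty.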


• Software Defect Typing and Classification: Software defect typing and classification provides a uniform approach to the classification of anomalies found in software and its documentation. It describes the processing of anomalies discovered during any software life cycle phase, and it provides comprehensive lists of software anomaly classifications and related data items that are helpful in identifying and tracking anomalies. The minimum set of classifications deemed necessary for a complete data set is indicated as mandatory. More detailed classifications are provided for those projects that require more rigor.
• Software Quality Modeling: Software quality, or defect density, is the number of software defects committed per thousand lines of software source code. The defects measure quantifies the number, status, and priority of defects reported. It provides useful information on the ability of a supplier to find and fix defects in hardware, software, or documentation. The number of defects indicates the amount of rework and has a direct impact on quality. Arrival rates can indicate product maturity (a decrease should occur as testing is completed). Closure rates are an indication of progress and can be used to predict test completion. Tracking the length of time that defects have remained open can be used to determine whether progress is being made in fixing defects or whether rework is being deferred. A defect density measure (an expression of the number of defects in a quantity of product) can be derived from this measure. Defect density can identify the components with the highest concentration of defects.

10.0 CODE CONTROL

This section shall define the methods and facilities used to maintain, store, secure, and document controlled versions of the identified software during all phases of the software life cycle. This may be implemented in conjunction with a computer program library. This may be provided as part of the SCMP.
If so, an appropriate reference shall be made thereto. The methods and facilities used to maintain, store, secure, and document controlled versions of the identified software during all phases of the software life cycle, and that shall be enforced by the SQAP, shall be identified and defined by the software configuration management plan.
• Software Configuration Management Plan (SCMP): The purpose of the software configuration management plan is to provide a structure for identifying and controlling software documentation, software source code, software interfaces, and databases to support all software life cycle phases; to support the software development and maintenance methodology that fits the software requirements, standards, policies, organization, and management philosophy; and to support the production of management and product information concerning the status of software baselines, change control, tests, releases, and audits.

11.0 MEDIA CONTROL


This section shall state the methods and facilities to be used to (a) identify the media for each computer product and the documentation required to store the media, including the copy and restore process, and (b) protect computer program physical media from unauthorized access or inadvertent damage or degradation during all phases of the software life cycle. This may be provided as a part of the SCMP. If so, an appropriate reference shall be made thereto. The methods and facilities to be used to identify the media for each computer product and the documentation required to store the media, including the copy and restore process, and to protect computer program physical media from unauthorized access or inadvertent damage or degradation during all phases of the software life cycle, and that shall be enforced by the SQAP, shall be identified and defined by the software configuration management plan.
• Software Configuration Management Plan (SCMP): The purpose of the software configuration management plan is to provide a structure for identifying and controlling software documentation, software source code, software interfaces, and databases to support all software life cycle phases; to support the software development and maintenance methodology that fits the software requirements, standards, policies, organization, and management philosophy; and to support the production of management and product information concerning the status of software baselines, change control, tests, releases, and audits.

12.0 SUPPLIER CONTROL

This section shall state the provisions for assuring that software provided by suppliers meets established requirements. In addition, this section shall state the methods that will be used to assure that the software supplier receives adequate and complete requirements. For previously developed software, this section shall state the methods to be used to assure the suitability of the product for use with the software items covered by the SQAP.
For software that is to be developed, the supplier shall be required to prepare and implement a SQAP in accordance with this standard. This section shall also state the methods to be employed to assure that the developers comply with the requirements of this standard. The provisions for assuring that software provided by suppliers meets established requirements, and that shall be enforced by the SQAP, shall be identified and defined by the software subcontract management policy and procedure.
• Software Subcontract Management Policy and Procedure: This procedure establishes the guidelines by which subcontract software managers define the software work to be subcontracted, select software subcontractors, create software subcontract agreements, track software subcontractors, and make changes to software subcontract agreements. This procedure shall begin with project system managers ensuring that documented standards and procedures are used for selecting software subcontractors and

managing software subcontracts, and end with software quality assurance reviewing and/or auditing the acceptance processes for products of software subcontractors.

13.0 RECORDS COLLECTION, MAINTENANCE, AND RETENTION

This section shall identify the SQA documentation to be retained, shall state the methods and facilities to be used to assemble, safeguard, and maintain this documentation, and shall designate the retention period. The methods and facilities to be used to assemble, safeguard, and maintain the SQA documentation to be retained, and that shall be enforced by the SQAP, shall be identified and defined by the software configuration management plan.
• Software Configuration Management Plan (SCMP): The purpose of the software configuration management plan is to provide a structure for identifying and controlling software documentation, software source code, software interfaces, and databases to support all software life cycle phases; to support the software development and maintenance methodology that fits the software requirements, standards, policies, organization, and management philosophy; and to support the production of management and product information concerning the status of software baselines, change control, tests, releases, and audits.

14.0 TRAINING

This section shall identify the training activities necessary to meet the needs of the SQAP. The training activities necessary to meet the needs of the SQAP, and that shall be enforced by the SQAP, shall be identified and defined by the training program policy and procedure.
• Training Management Policy and Procedure: This procedure establishes the guidelines by which project software managers develop and maintain a training plan for each software project, training groups develop and revise the organization training plan, training groups perform training for the organization and software projects, training groups develop and maintain training courses, and training groups maintain records of training for the organization and software projects. This procedure shall begin with senior management ensuring that the skills and knowledge needed for software management and technical roles are identified and end with independent experts verifying that training groups follow the organization training plan.

15.0 RISK MANAGEMENT

This section shall specify the methods and procedures employed to identify, assess, monitor,
and control areas of risk arising during the portion of the software life cycle covered by the SQAP. The methods and procedures employed to identify, assess, monitor, and control areas of risk arising during the portion of the software life cycle, and that shall be enforced by the SQAP, shall be identified and defined by the software project plan.
• Software Project Plan (SPP): The purpose of the software project plan is to serve as the controlling document for managing a software project. A software project plan defines the technical and managerial project functions, activities, and tasks necessary to satisfy the requirements of a software project, as defined in the project agreement.
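One common way such a plan quantifies risk for assessment and monitoring is risk exposure, the probability of occurrence times the impact. That particular formula is an assumption here, offered only as an illustration, not a requirement of this template:

```python
# Illustrative risk assessment sketch. Scoring risk exposure as
# probability * impact is a common convention, assumed here for
# illustration; it is not mandated by this template.

def risk_exposure(probability, impact):
    """Expected loss: probability of occurrence (0..1) times impact (cost)."""
    assert 0.0 <= probability <= 1.0
    return probability * impact

def rank_risks(risks):
    """Order identified risks so the highest exposures are monitored first."""
    return sorted(risks, key=lambda r: risk_exposure(r["p"], r["impact"]),
                  reverse=True)

# Example risk register entries (hypothetical names and values).
risks = [
    {"name": "key staff turnover", "p": 0.2, "impact": 50000},
    {"name": "requirements growth", "p": 0.6, "impact": 30000},
    {"name": "tool immaturity", "p": 0.1, "impact": 10000},
]
ranked = rank_risks(risks)
```

Ranking by exposure gives the project a defensible basis for deciding which risks get active mitigation and which are merely watched.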
