
Appendix D

Sample Test Plan
WALLSTREET FINANCIAL TRADING SYSTEM

Delivery 2

Test Plan
(Date)

PREPARED FOR:

FINANCIAL TRADEWINDS CORPORATION
City, State

PREPARED BY:

AUTOMATION SERVICES INCORPORATED (AMSI)
Street Address
City, State
Contents

D.1 Introduction
    D.1.1 Purpose
    D.1.2 Background
    D.1.3 System Overview
    D.1.4 Applicable Documents
    D.1.5 Master Schedule
D.2 Roles and Responsibilities
    D.2.1 Project Organization
    D.2.2 Project Roles and Responsibilities
    D.2.3 Test Task Structure
    D.2.4 Test Team Resources
D.3 Test Program
    D.3.1 Scope
    D.3.2 Test Approach
    D.3.3 Test Strategies
    D.3.4 Automated Tools
    D.3.5 Qualification Methods
    D.3.6 Test Requirements
    D.3.7 Test Design
    D.3.8 Test Development
D.4 Test Environment
    D.4.1 Test Environment Configuration
    D.4.2 Test Data
D.5 Test Execution
    D.5.1 Test Program Reporting
    D.5.2 Test Program Metrics
    D.5.3 Defect Tracking
    D.5.4 Configuration Management
D.6 Detailed Test Schedule

Appendixes
    D.A Test Procedure Development Guidelines
    D.B Test Verification Summary Matrix
    D.C Test Procedures and Test Scripts

D.1 Introduction

D.1.1 Purpose

This test plan will outline and define the strategy and approach taken to perform testing on the WallStreet Financial Trading System (WFTS) project. It is intended for use by WFTS project personnel in understanding and carrying out prescribed test activities and in managing these activities through successful completion. This document defines the details of test responsibilities and activities and describes the tests to be conducted.

This test plan has been developed to fulfill the following objectives:

• To lay out the management and technical effort necessary to support testing throughout the system development life cycle
• To establish a comprehensive test plan that identifies the nature and extent of tests deemed necessary to achieve the testing objectives for the WFTS project, including software and hardware requirements
• To coordinate an orderly schedule of events, identify equipment and organizational requirements, describe test methodologies and strategies to be used, and identify items to be delivered
• To provide a plan that outlines the contents of detailed test procedure scripts and the execution of those test procedure scripts (that is, which testing techniques will be used)

To help standardize the test effort and make it more efficient, test procedure development guidelines are provided in Appendix D.A. These guidelines have been adopted and are being implemented by the AMSI test team for the WFTS project. The test team will take advantage of testing tools to help improve and streamline the testing process. For further detail on the test strategy, see Section D.3.3 of this plan.

Test procedures are identified and tracked using the Dynamic Object-Oriented Requirements Management System (DOORS) requirements management tool. This approach will allow for easy management of test progress status. Once a test is performed, the test procedure status is revised within DOORS to reflect actual test results, such as pass/fail. Appendix D.B provides a test verification summary matrix that is generated using DOORS; it links the test procedures to test requirements so as to measure test coverage. Test procedures and test scripts supporting system acceptance test (SAT) are provided in Appendix D.C.
D.1.2 Background

The WFTS project was initiated in response to management's recognition of the need for improvement within the service management operations at Financial Tradewinds Corporation (FTC). A mission element needs statement was developed and approved that authorized the establishment of a new system called the WallStreet Financial Trading System (WFTS).

The project consists of several deliveries. Delivery 1 of the WFTS, which was implemented recently, provided system foundation applications. Delivery 2 involves the development of mission and support applications, which will enable FTC to trade securities and various assets on Wall Street more effectively.

The test requirements definition for the WFTS project is driven by detailed requirements/use cases/use case scenarios (see Section D.3.6) and by the evolutionary nature of additional user input. Use case requirements are maintained within the DOORS requirements management tool. Detailed WFTS use case requirements have been established for Delivery 2 and define test requirements and test procedures. Test documentation (test plans, test procedures, and test results) is captured and stored within DOORS. Additionally, PVCS Tracker will be used to manage software problem reports.
D.1.3 System Overview

This section provides an overview of the WFTS and identifies critical and high-risk functions of the system.

System Description. WFTS presently consists of a suite of hardware and software, including nondevelopmental items (NDI)/commercial off-the-shelf (COTS) and developmental software. WFTS will provide FTC with daily trading and executive decision-making support. Automation Services Incorporated (AMSI) developed WFTS Delivery 1 and is under contract to develop and test Delivery 2. Figure D.1.1 depicts the WFTS Delivery 2 software architecture. Each block represents a software component (configuration item) of the system. Table D.1.1 summarizes the WFTS software components and their estimated COTS composition.

Critical and High-Risk Functions. During system requirements analysis and requirements specification development, the AMSI test team participated in the review of use case analysis results and WFTS joint application development (JAD) sessions. Critical success and high-risk functions of the WFTS system were identified. These functions include those most critical to the mission of the system and those that help mitigate the greatest risk to successful system operation. These functions have been ranked in priority sequence, as shown in Table D.1.2. This understanding of functional importance serves as an input to test team prioritization of test activities.
[Figure D.1.1 WFTS Delivery 2 Software Architecture: mission applications (Active Trade Visibility, Financial Portfolio Management, Asset Trading, Forecast and Decision Support) and Support Applications run on an Application Platform of cross-functional services (User Interface, Data Management, Data Interchange, Network, System Management, Security Guard, Distributed Computing) over the Operating System.]
Table D.1.1 WFTS Software Components

ID Number  Description                      DI   NDI/COTS  D1  D2
OS-01      Operating system                 —    COTS      D1  —
UI-02      User interface                   —    COTS      D1  D2
DM-03      Data management                  DI   —         D1  D2
DI-04      Data interchange                 DI   —         D1  D2
NW-05      Network                          —    COTS      D1  —
SM-06      System management                20%  80%       D1  D2
SG-07      Security guard                   —    COTS      —   D2
DC-08      Distributed computing            30%  70%       D1  D2
SA-09      Support applications             80%  20%       —   D2
TV-10      Active trade visibility          25%  75%       —   D2
FP-11      Financial portfolio management   20%  80%       —   D2
AT-12      Asset trading                    DI   —         —   D2
DS-13      Forecasts and decision support   DI   —         —   D2
Table D.1.2 Critical and High-Risk Functions

Rank  Software Component  Indicator  Function
1     SG-07               High risk  Verify identification of trading partner account prior to any automated exchange of asset trading information
2     AT-12               Critical   Sort through asset trade opportunities, identify the best-value trade, and close the deal on this trade
3     SG-07               High risk  Provide communications and flow of information between software components operating at different levels of security classification
4     DS-13               High risk  Monitor exchange rates and primary economic indicators for changes in the securities market and the worldwide economy
5     TV-10               Critical   Monitor securities and the most significant securities movement
6     DS-13               Critical   Provide simulation modeling that produces extended forecasts, analyzes evolving trends, and provides long-term executive decision support
D.1.4 Applicable Documents

Documents that are pertinent to the WFTS Delivery 2 test program are listed in this section.

Project Documentation
• System Requirements Specification, Delivery 2
• Use Case Scenario Document, Delivery 2
• Software Design Document, Delivery 2
• Interface Design Document, Delivery 2
• WFTS Statement of Work (SOW)
• Concept of Operations
• Management Plan
• Software Development Plan
• Security Test Plan, Delivery 1
• Test Plan, Delivery 1
• Test Report, Delivery 1
• Security Certification Test Report, Delivery 1
• Delivery 2 Kick-Off Meeting Presentation Slides
• Security Requirements and Design Review Meeting Materials, Delivery 2
• Security Review Meeting Report
• User Interface Review Presentation Slides, Delivery 2
• System Implementation Plan, Delivery 2
• Security Plan, Delivery 2 (draft)
• Security Test Plan, Delivery 2 (draft)

Standards Documentation
• Automated Test Life-Cycle Methodology (ATLM)
• Test Procedure Design and Development Standards
• IEEE/EIA 12207 Information Technology Software Life-Cycle Process
• AMSI Standards and Procedures (standard process supporting the business analysis phase, requirements phase, design phase, development phase, testing phase, and maintenance phase)
• AMSI Code Inspection Process
• AMSI Programming Style Guide
• AMSI GUI Style Guide
• AMSI Usability Style Guide

Tool Documentation
• TeamTest (Test Management Tool) User Manual
• PVCS Tracker Documentation
• Performance Studio Documentation
• DOORS (Requirements Management Tool) User Manual
• PVCS (Configuration Management Tool) User Manual
• SystemArmor Security Guard Software Documentation
• UNIX Operating System Software Documentation
• InsitFul Securities Trade Visibility Software Documentation
D.1.5 Master Schedule

This section addresses the top-level schedule for the WFTS test program. The test program schedule contains the major events, activities, and deliverables involved in the test program. Activities performed by the test team include the design, development, and execution of tests, as well as inspections of project documentation and software products. The test team will also produce test documentation consisting of the items listed in Table D.1.3.

Table D.1.3 Test Documentation

• Test plan: test planning document. Due: (date)
• Test verification summary matrix: a requirements traceability matrix that maps test procedure coverage to test requirements and specifies a test qualification method for each system requirement. Due: (date)
• Test procedures: the scripts used to perform/execute testing. Due: (timeframe)
• Test and integration working group meeting minutes: minutes from test and integration working group meetings. Due: periodic
• Test development progress reports: metrics reports outlining the progress status of the test procedure development effort. Due: biweekly
• Test readiness report or presentation slides: report or presentation that outlines the readiness of the test program to conduct user acceptance testing. Due: (date)
• Test execution progress reports: reports that outline the status of test execution and other progress and quality metrics. Due: biweekly
• Defect tracking reports: reports that outline the number and severity of outstanding software problem reports. Due: biweekly
• TPM status reports: reports that outline the progress of the system toward meeting defined technical performance measures (TPM). Due: biweekly
• Test report: report documenting the outcome of the test. Due: (date)
[Figure D.1.2 Test Program Milestone Schedule: an August through April timeline covering the test plan, unit test phase, code walkthrough, test procedure script development, integration test phase, system test phase, system walkthrough, security test, SAT, and the Site 1 and Site 2 installations.]
The major events, activities, and documentation to be performed or prepared in support of the WFTS test program are outlined in the test program milestone schedule depicted in Figure D.1.2.

D.2 Roles and Responsibilities

Roles and responsibilities of the various groups are defined in this section.
D.2.1 Project Organization

Figure D.2.1 depicts the WFTS project organization. Reporting to the WFTS project manager are four line supervisors: the software development manager, the systems engineering manager, the product assurance manager, and the functional requirements manager. The software development manager is responsible for software and database design and development, as well as unit- and integration-level software tests. The systems engineering manager leads the system architecture design effort and is responsible for new COTS product evaluations. This manager maintains the network that supports the system development and test environments, and is responsible for database administration of the deployed Delivery 1 WFTS system. The product assurance manager is responsible for test, configuration management, and quality assurance activities.

The test manager is responsible for system test and user acceptance test activities supporting the WFTS system. The functional requirements manager is responsible for requirements analysis, system requirements specification, and maintenance of the requirements baseline. Functional analyst personnel also support development and review of detailed design activities.

[Figure D.2.1 WFTS Project Organization: the project manager oversees the software development manager, systems engineering manager, product assurance manager, and functional requirements manager; the test manager, CM manager, and QA manager report to the product assurance manager.]
D.2.2 Project Roles and Responsibilities

D.2.2.1 Project Management

The project manager is responsible for client relations, project deliverables, schedules, and cost accounting. He or she coordinates with the particular line manager with regard to each technical task performed. The staff of project management specialists maintain project plans, schedules, and cost accounting information. Project management is responsible for ensuring that standards and procedures are followed and implemented appropriately.

D.2.2.2 Functional Requirements

The requirements group is responsible for requirements analysis and system requirements specification and for the derivation of subsequent use cases. This group also supports development and review of detailed design activities.

D.2.2.3 Software Development

The software development group is responsible for software development, as well as unit and integration software tests. It must develop software products in accordance with software development standards and conventions as specified in the software development plan (SDP). The software development group also performs unit and integration test phase planning. The results of unit and integration test phase planning are then provided as input to Section D.3 of the test plan.

For software development items, each developer will maintain a systems development folder (SDF) that contains the design documentation, printed copies of lines of code and user screens generated, development status of the item, and test results applicable to the item.

Test support responsibilities of the software development group include those described here.
Software Product Design and Development. When designing and developing any software or database product, the developer will comply with the software development standards and conventions specified in the SDP. Certain SDP provisions are automatically enforceable, such as the use of system development folders and compliance with the procedures associated with the use of the product development reuse library. Testability will be incorporated into the software as defined in the SDP. The third-party controls (widgets) defined for the development of this system must comply with the list of third-party controls that are compatible with the automated testing tool. The test team will be informed of peer reviews and code walkthroughs initiated by the development team.
Development Documentation. The development team will maintain SDFs. Embedded within the lines of programming code will be documentation in the form of comments. The embedded comments facilitate understanding of software structure and define the purpose of software routines. They will trace or correlate to pseudocode so as to facilitate software design traceability from the actual source code to the design document.
Unit Test Phase. Developers will test individual software units with respect to their function and integrity. Software unit program code will be analyzed to ensure that the code corresponds to functional requirements. Tracing tools will minimize code volume and eradicate dead code. Memory leakage tools will be applied, and code coverage tools will be used to verify that all paths have been tested. Unit testing will be performed by the developers in accordance with AMSI standards and procedures and witnessed by the system test team.
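For illustration only (the routine, its trading limit, and the test names below are hypothetical and are not the AMSI unit test standard), a unit test written with decision coverage in mind exercises each branch of the unit under test, which is what the code coverage tools are then used to confirm:

```python
import unittest

def classify_order(quantity, limit):
    """Hypothetical unit under test: flag orders that exceed a trading limit."""
    if quantity <= 0:
        raise ValueError("quantity must be positive")
    return "REVIEW" if quantity > limit else "ACCEPT"

class ClassifyOrderTest(unittest.TestCase):
    # One test per decision outcome, so a coverage tool reports full branch coverage.
    def test_rejects_nonpositive_quantity(self):
        with self.assertRaises(ValueError):
            classify_order(0, 100)

    def test_accepts_order_within_limit(self):
        self.assertEqual(classify_order(50, 100), "ACCEPT")

    def test_flags_order_over_limit(self):
        self.assertEqual(classify_order(150, 100), "REVIEW")

if __name__ == "__main__":
    unittest.main()
```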
Integration Test Phase. Integration testing will be conducted to demonstrate the consistency between the software design and its implementation in accordance with AMSI standards and procedures. Its results will be recorded in the SDFs and inspected for software quality assurance. When software modules are ready to support the integration and system test phases, the source code and all files required for proper generation of the executables will be baselined within the software configuration management tool. Each software build will be generated using the source code products maintained within the software configuration management tool. The software development group will perform integration testing and verify completeness according to integration test procedures; the system test team will witness these activities.

The software development group is also responsible for database design and development and all data migration and synchronization activities. Additionally, it helps the test group in setting up a test environment. The database group develops the database in accordance with database development standards and conventions as specified in the SDP.
D.2.2.4 Systems Engineering

The systems engineering group is responsible for the development of the system architecture design, integration of COTS products, research of COTS products, and evaluation of COTS products. As part of COTS integration, the systems engineering group will be responsible for the design and development of software modules as well as testing of the integrated COTS products. The systems engineering group will develop and maintain a simulation model of the WFTS using the OPNET simulation tool. The WFTS simulation model will simulate the major functions of the system and provide information on bottlenecks and queue buildups.

The systems engineering group maintains the network and hardware that work with the system development and test environments, and is responsible for database and system security administration of the deployed Delivery 1 WFTS system. The group installs and configures COTS products as required to integrate them with the rest of the system. The necessary parameters are defined for COTS products by this group and then set to work in the target environment. Hardware is installed and configured to reflect a typical end-user site. Upon receipt of the new system equipment that is destined for deployment at an end-user site, the appropriate hardware and system software configurations are installed.
D.2.2.5 Product Assurance

The product assurance group implements test, configuration management, and quality assurance activities. The system test team performs various test activities supporting the WFTS system by following the ATLM. It takes responsibility for system test and user acceptance test activities supporting the WFTS system; it also supports the unit and integration test phases described in Section D.3.

The system test team develops the test plan and procedures, and it performs the tests necessary to ensure compliance with functional, performance, and other technical requirements. Test program activities include the maintenance of test automation reuse libraries, planning and execution of tests, and the development of test reports. These responsibilities are detailed below.
Test Procedures Development. Test procedures will be prepared for system-level testing that provide the inspector with a step-by-step (test script) operational guide to performing each test. They will exercise both system software (COTS and developmental items) and hardware.

Test procedures will include the test procedure title, test description, references to the system specification, prerequisite conditions for the test, test execution steps (script), expected results, data requirements for the test, acceptance criteria, and actual results. Those test procedures to be used for site acceptance testing will be identified as a result of input from end users.
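As a point of illustration only (the field names and the example record below are hypothetical and are not drawn from the WFTS test procedure set), the required content of a test procedure can be pictured as a simple record:

```python
from dataclasses import dataclass

@dataclass
class TestProcedure:
    """One system-level test procedure record, mirroring the contents listed
    above: title, description, specification references, prerequisites, steps,
    expected results, data requirements, acceptance criteria, actual results."""
    procedure_id: str
    title: str
    description: str
    spec_references: list[str]          # system specification paragraphs covered
    prerequisites: list[str]
    steps: list[str]                    # the step-by-step script
    expected_results: list[str]
    data_requirements: list[str]
    acceptance_criteria: str
    actual_results: str = ""            # filled in at execution time (pass/fail notes)

# Hypothetical example record; identifiers and content are invented for illustration.
example = TestProcedure(
    procedure_id="SAT-EX-001",
    title="Example login verification",
    description="Verify that a valid trading-partner account can log in.",
    spec_references=["3.2.1"],
    prerequisites=["Test environment baselined", "Test account created"],
    steps=["Start the client", "Enter the test account credentials", "Submit"],
    expected_results=["Main trading screen is displayed"],
    data_requirements=["One valid test account"],
    acceptance_criteria="All steps produce the expected results.",
)
```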
Unit and Integration Test Phase. The system test team will witness unit and integration test activities.

System Test Phase. The system test team is responsible for system testing; the scope of this testing is described in Section D.3. The test team will document results within the requirements management tool and produce progress reports, as detailed in Section D.1.5.

System Acceptance Test (SAT) Phase. The system test team performs user acceptance testing, as described in Section D.3. The test team will document the results within the requirements management tool and produce progress reports as specified in Section D.1.5.

Test Reports. Raw test data and reports will be kept to indicate the specific pass/fail results of all system hardware and software tests. The test team will prepare a test report at the conclusion of system and user acceptance testing, which will include the raw test data, reports, and a test results summary, together with conclusions and recommendations.

Field/Site Acceptance Testing. This step will involve checkout and performance testing to ensure that equipment and software are installed correctly. Test activities will include verification that the system performs in accordance with specifications and is capable of meeting operational requirements. Site acceptance tests will consist of a reduced set of confirmation tests providing a reasonable check that the system is ready for operation.
D.2.3 Test Task Structure

Table D.2.1 indicates the types of test tasks that may be performed by the system test team for the WFTS test program. This task structure represents the work breakdown structure (WBS) that will be used by the test team to support cost accounting activities on the project.
Table D.2.1 Test Program Work Breakdown Structure

1 Project Start-Up
1.1 Scope. Outline preliminary test goals and objectives.
1.2 Sizing. Perform test effort sizing.
1.3 Team composition. Undertake test team composition analysis and test engineer job description development.
1.4 Recruiting. Develop test engineer recruiting advertisements and conduct interviews.

2 Early Project Support
2.1 Goals/objectives. Further define test goals and objectives and review goals/objectives with project management, development group, and test engineers to develop understanding and acceptance of test goals and objectives.
2.2 Constraint examination. Review project constraints, such as short time to market or limited resources.
2.3 Testability review. Assure that testability is designed into the application.
2.4 Requirements review. Ensure that requirements are specified in terms that are testable.
2.5 Review of standards. Identify and become acquainted with applicable standards.

3 Decision to Automate Test
3.1 Test objectives/strategies. Refine definition of test objectives for the project and develop test strategies.
3.2 Test tool value. Outline the value/benefits derived from incorporating an automated test tool.
3.3 Test tool proposal. Develop a test tool proposal.

4 Test Tool Selection and Evaluation
4.1 Systems engineering environment. Review the organization's systems engineering environment.
4.2 Test tools available. Review types of test tools available.
4.3 Test tool candidates. Research, evaluate, and score test tool candidates.
4.4 Evaluation domain definition.
4.5 Hands-on tool evaluation.
4.6 Test tool evaluation report. Document tool selection and results of evaluations.
4.7 Test tool purchase. Develop purchase order and coordinate with the purchasing department.

5 Test Tool Introduction
5.1 Test process. Implement (or modify existing) testing process, methodologies, and life-cycle approach to testing to allow for the introduction of automated testing tools. Assure that the test effort is performed in parallel with the development effort. Maintain the test tool introduction process.
5.2 Defect detection activities. Attend inspections and walkthroughs.
5.3 Test tool expertise. Participate in formal test tool training, review test tool tutorials, and practice with the test tool.
5.4 Test tool validation. Validate new test tool releases to ensure that the tool performs according to specification and that it works in the particular operating environment.
5.5 Test consultation. Create a test support hotline, answering questions within the organization pertaining to the test process and tools. Provide mentoring and coaching on automated software test discipline.
5.6 Test tool orientations. Provide presentations and demonstrations to orient projects and personnel on the use and application of test tools.
5.7 Relationship building. Develop a working relationship with the development group and facilitate communications among project team members.
5.8 Network environment setup. Consult on the setup of an automated test tool repository on the local area network. Request additional network storage space where necessary.
5.9 Defect management process. Establish a process (workflow) for defect reporting and resolution for a project. Outline applicable standards and formats.
5.10 Defect management training. Provide training on the process for defect reporting and resolution.
5.11 Test tool reporting. Determine the types of automated test reports applicable to the project.

6 Test Planning
6.1 Test requirements. Document application-under-test (AUT) test requirements.
6.2 Examination of constraints. Identify and outline constraints such as short time to market and limited engineering resources.
6.3 Test goals/objectives. Document goals and objectives for testing (for example, scalability, regression) within the test plan. Include goals pertaining to end-user involvement in the test process.
6.4 Test strategy. Document the test strategies and the types of test tools that apply on the project.
6.5 Test program activities. Develop a test strategy that incorporates test activities early within the development life cycle.
6.6 Deliverables. Identify the product deliverables on the project that will be reviewed or tested by test personnel.
6.7 Critical success functions. Work with the project team and business users to identify critical success functions and document them within the test plan.
6.8 Test program parameters. Define test program parameters such as assumptions, prerequisite activities, system acceptance criteria, and test program risks and document them within the test plan.
6.9 Level of quality. Work with the project team and business users to determine the level of quality for the project and document it within the test plan.
6.10 Test process. Document the test process within the test plan, including the test tool introduction process and the defect management process.
6.11 Test training. Document test training requirements and plans within the test plan.
6.12 Decision to automate test. Document the assessment outlining the benefit of using an automated test tool on the project, and the ability to incorporate an automated test tool given the project schedule.
6.13 Technical environment. Document the technical environment in which the AUT will be developed and eventually operate. Identify potential application design or technical automated testing tool issues that may need to be resolved.
6.14 Test tool compatibility check. Document results of the test tool compatibility check. Where an incompatibility problem arises, document work-around solutions and alternative test methods.
6.15 Quality gates. Plan for the incorporation of quality gates.
6.16 Risk assessments. Perform risk assessments in support of project management reviews and reporting requirements.
6.17 Test readiness reviews. Perform planning and analysis activities necessary for test readiness reviews. Develop presentation slides and perform presentations where required.
6.18 Test plan document. Assemble and package the test-planning documentation into a test plan. Incorporate changes to the test plan as a result of test plan reviews by project management and end users or customers. Maintain the test plan document throughout the test life cycle.
6.19 Test data. Document test data requirements and plans for developing and maintaining a test data repository.
6.20 Test environment. Identify requirements for a test laboratory or test environment and identify the personnel who are responsible for setting up and maintaining this environment.
6.21 Reporting requirements. Define reporting requirements and document them within the test plan.
6.22 Roles and responsibilities. Define and document roles and responsibilities for the test effort.
6.23 Test tool system administration. Outline the requirements for setting up and maintaining the automated test tools and environment, and identify the personnel who are responsible for setting up and maintaining the test tools. Administration includes setup of tool users and various privilege groups.

7 Test Design
7.1 Prototype automated environment. Prepare and establish a test laboratory environment to support test design and development.
7.2 Techniques and tools. Identify test techniques/strategies and automated tools to be applied to the project application and its interfaces.
7.3 Design standards. Prepare and establish test procedure design standards.
7.4 Test procedure/script design. Develop a list and hierarchy of test procedures and test scripts. Identify which procedures and scripts are to be performed manually and which will require an automated test tool.
7.5 Test procedure/script assignments. Assign test team personnel to the various test procedures and scripts.
7.6 Inputs/outputs. Develop test procedure/script design inputs and expected outputs.
7.7 Test automation script library. Identify test automation scripts contained within the organization's script library that can be applied to the project.

8 Test Development
8.1 Best practices/standards. Develop and tailor best practices and standards for test development for the project.
8.2 Script creation standards. Implement test procedure script creation standards (for example, comment each automated testing tool scripting step, fill in test procedure header file information, provide modularity, and so on).
8.3 Script execution standards. Implement test procedure execution standards (for example, a consistent environment, test database backup, and rollback).
8.4 Test setup. Implement test procedure script strategies during the various testing phases (for example, regression test phase, performance test phase).
8.5 Test procedure pseudocode. Prepare step-by-step pseudocode for the test procedures.
8.6 Work-around solutions. Develop work-around solutions for tool/AUT incompatibility problems.
8.7.1 Unit test phase test procedures/scripts. Witness execution of unit test procedures and scripts.
8.7.2 Integration test phase test procedures/scripts. Witness execution of integration test procedures and scripts.
8.7.3 System test phase test procedures/scripts. Develop test procedures and automate scripts that support all phases of the system test cycle (that is, regression, performance, stress, backup, and recoverability).
8.7.3.1 Develop a test procedure execution schedule.
8.7.3.2 Conduct automated test reuse analysis.
8.7.3.3 Conduct analysis to determine which tests to automate.
8.7.3.4 Develop a modularity relationship matrix.
8.7.4 Acceptance test phase test procedures/scripts. Develop and maintain test procedures and scripts.
8.8 Coordination with the database group to develop the test database environment. Baseline and maintain test data to support test execution.
8.9 Test procedure peer reviews. Review test procedures against the script creation standards (comments for each test tool scripting step, header file information, modularity, and so on).
8.10 Reuse library. Develop and maintain a test procedure reuse library for the project.
8.11 Test utilities. Support the creation or modification of in-house test support utilities that improve test effort efficiency.

9 Test Execution
9.1 Environment setup. Develop environment setup scripts.
9.2 Testbed environment. Develop testbed scripts and perform testbed development logistics.
9.3 System test phase execution. Execute test procedures as part of walkthroughs or test demonstrations.
9.4 Acceptance test phase execution. Execute test procedures as part of walkthroughs or test demonstrations.
9.5 Test reporting. Prepare test reports.
9.6 Issue resolution. Resolve daily issues regarding automated test tool problems.
9.7 Test repository maintenance. Perform test tool database backup/repair and troubleshooting activities.

10 Test Management and Support
10.1 Process reviews. Perform a test process review to ensure that standards and the test process are being followed.
10.2 Special training. Seek out training for test engineers for special niche test requirements that become apparent during the test life cycle. Continue to develop technical skills of test personnel.
10.3 Testbed configuration management (CM). Maintain the entire testbed/repository (that is, test data, test procedures and scripts, software problem reports) in a CM tool. Define the test script CM process and ensure that test personnel work closely with the CM group.
10.4 Test program status reporting. Identify mechanisms for tracking test program progress. Develop periodic reports on test progress. Reports should reflect estimates to complete tasks in progress.
10.5 Defect management. Perform defect tracking and reporting. Attend defect review meetings.
10.6 Metrics collection and analysis. Collect and review metrics to determine whether changes in the process are required and to determine whether the product is ready to be shipped.

11 Test Process Improvement
11.1 Training materials. Develop and maintain test process and test tool training materials.
11.2 Review of lessons learned. Perform this review throughout the testing life cycle and gather test life-cycle benefits information.
11.3 Metrics analysis and reporting. Analyze test process metrics across the organization and report the results of this analysis.
D.2.4 Test Team Resources

The composition of the WFTS test team is outlined within the test team profile depicted in Table D.2.2. This table identifies the test team positions on the project together with the names of the personnel who will fill these positions. The duties to be performed by each person are described, and the skills of the individuals filling the positions are documented. Table D.2.2 also records, for each test team member, total test program experience as well as years of experience with the designated test management tool for the project.

The WFTS test team includes both full-time resources and personnel who aid in testing on a part-time basis. The phases supported by each test team member and the availability during each test phase are outlined in Table D.2.3.

The WFTS test team will need to have a working knowledge of several tools for its test program. Table D.2.4 outlines the experience of the test team members with the test management, requirements management, configuration management, and defect tracking tools. The last column indicates the training required for each test team member.
Table D.2.2 Test Team Profile

• Test manager: Todd Jones. Responsible for test program, customer interface, recruiting, test tool introduction, and staff supervision. Skills: MS Project, SQA Basic, SQL, MS Access, UNIX, test tool experience. Test experience: 12 years; test tool experience: 1.0 year.
• Test lead: Sarah Wilkins. Performs staff supervision, cost/progress status reporting, and test planning/design/development and execution. Skills: TeamTest, Purify, Visual Basic, SQL, SQA Basic, UNIX, MS Access, C/C++, SQL Server. Test experience: 5 years; test tool experience: 3.0 years.
• Test engineer: Tom Schmidt. Performs test planning/design/development and execution. Skills: test tool experience, financial system experience. Test experience: 2 years; test tool experience: 0.5 year.
• Test engineer: Reggie Miller. Performs test planning/design/development and execution. Skills: test tool experience, financial system experience. Test experience: 2 years; test tool experience: 1.0 year.
• Test engineer: Sandy Wells. Performs test planning/design/development and execution. Skills: financial system experience. Test experience: 1 year; test tool experience: none.
• Test engineer: Susan Archer. Responsible for test tool environment, network and middleware testing. Performs all other test activities. Skills: Visual Basic, SQL, CNE, UNIX, C/C++, SQL Server. Test experience: 1 year; test tool experience: none.
• Junior test engineer: Lisa Nguyen. Performs test planning/design/development and execution. Skills: Visual Basic, SQL, UNIX, C/C++, HTML, MS Access. Test experience: none; test tool experience: none.
Table D.2.3 Test Team Personnel Availability

Position              Name            Test Phases                   Availability
Test manager          Todd Jones      Unit/Integration Test         100%
                                      System Test/Acceptance Test   100%
Test lead             Sarah Wilkins   Unit/Integration Test         100%
                                      System Test/Acceptance Test
Test engineer         Tom Schmidt     System Test/Acceptance Test   100%
Test engineer         Reggie Miller   System Test/Acceptance Test   50%
Test engineer         Sandy Wells     System Test/Acceptance Test   50%
Test engineer         Susan Archer    Unit/Integration Test         100%
                                      System Test/Acceptance Test
Junior test engineer  Lisa Nguyen     Unit/Integration Test         100%
                                      System Test/Acceptance Test
Table D.2.4 Test Team Training Requirements

Test Team Member  Test Management Tools  RM Tool  CM Tool  Defect Tracking Tool  Training Required
Todd Jones        ✓                      ✓        ✓        ✓                     None
Sarah Wilkins     ✓                      ✓        ✓        ✓                     None
Tom Schmidt       ✓                      ✓        ✓        ✓                     None
Reggie Miller     ✓                      ✓        ✓        ✓                     None
Sandy Wells       —                      ✓        ✓        ✓                     TestStudio
Susan Archer      —                      ✓        ✓        ✓                     PerformanceStudio
Lisa Nguyen       —                      —        —        —                     All four tools
D.3 Test Program

D.3.1 Scope

The WFTS test program is aimed at verifying that the Delivery 2 WFTS system satisfies the requirements/derived use cases and is ready to be deployed in FTC's production environment. The test program involves the implementation of a number of test strategies across several test phases, including unit, integration, system, user acceptance, and site acceptance testing.

The system-level test effort consists of functional testing, performance testing, backup and recoverability testing, security testing, and verification of system availability measures. Separate security testing is applied to ensure that necessary security mechanisms perform as specified. Site acceptance testing will be performed in association with site installation and checkout activities.

Tests will be comprehensive enough to cover the network, hardware, software application, and databases. Software tests will focus on NDI/COTS and developmental software. The unit and integration test phases will involve tests of newly created or modified software as well as COTS products incorporated in the WFTS Delivery 2 development effort, as noted in Table D.1.1. System and user acceptance tests will exercise Delivery 2 development products and perform regression testing on the existing Delivery 1 application software. Thus the complete WFTS system will be reviewed.
D.3.2 Test Approach

When developing the WFTS test approach, the test team reviewed system requirements/derived use cases and use case scenarios; it also studied the system description and critical/high-risk function information described in Section D.1.3. Using this information, the test team performed a test process analysis exercise to identify a test life cycle. In addition, it analyzed the test goals and objectives that could be applied on the WFTS test effort. The results of these analyses appear in Table D.3.1.
Table D.3.1 Test Process Analysis Documentation

Process Review
• The project will use the organization's standard test process, which adopts the ATLM.
• To ensure a smooth implementation of the automated test tool, the project will take advantage of the ATLM test tool introduction process.

Test Goals
• Increase the probability that the AUT will behave correctly under all circumstances.
• Detect and support the removal of all defects in the AUT by participating in defect prevention activities and conducting defect detection activities, as defined in the test strategy.
• Increase the probability that the AUT meets all defined requirements.
• Perform test activities that support both defect prevention and defect removal.
• Be able to execute a complete test of the application within a short timeframe.
• Incorporate a test design that minimizes test script rework following changes to the application.

Test Objectives
• Ensure that the system complies with defined client and server response times.
• Ensure that the most critical end-user paths through the system perform correctly.
• Identify any significant defects in the system, track software problem reports, and verify closure of all significant software problem reports.
• Ensure that user screens perform correctly.
• Ensure that system changes have not had an adverse effect on existing software modules.
• Use automated test tools, whenever possible, to provide high test program return.
• Incorporate test design and development that minimizes test script rework following changes to the application.
In addition to identifying test goals and objectives, the test team documented test program parameters, including its assumptions, prerequisites, system acceptance criteria, and risks.

D.3.2.1 Assumptions

The test team developed this plan with the understanding of several assumptions concerning the execution of the WFTS project and the associated effect on the test program.

Test Performance. The test team will perform all tests on the WFTS project with the exception of those unit and integration phase tests, which are performed by the system developers and witnessed by the system test group.

Security Testing. System security tests, designed to satisfy the security test requirements outlined within the security test plan, will be executed during system testing and will be incorporated into the test procedure set constituting the system acceptance test (SAT).

Early Involvement. The test team will be involved with the WFTS application development effort from the beginning of the project, consistent with the ATLM. Early involvement includes the review of requirement statements and use cases/use case scenarios and the performance of inspections and walkthroughs.

Systems Engineering Environment. The suite of automated tools and the test environment configuration outlined within this plan are based upon existing systems engineering environment plans outlined within the WFTS management plan and the software development plan. Changes in the systems engineering environment will require subsequent and potentially significant changes to this plan.

Test Team Composition. The test team will include three business area functional analysts. These analysts will be applied to the system test effort according to their functional area expertise. While these analysts are on loan to the test group, they will report to the test manager regarding test tasks and be committed to the test effort. They will support the test effort for the phases and percentages of their time as noted in Section D.2.4.

Test Limitations. Given the resource limitations of the test program and the limitless number of test paths and possible input values, the test effort has been designed to focus effort and attention on the most critical and high-risk functions of the system. Defect tracking and its associated verification effort, likewise, focus on assessing these functions and meeting acceptance criteria, so as to determine when the AUT is ready to go into production.

Project Schedule. Test resources defined within the test plan are based upon the current WFTS project schedule and requirement baseline. Changes to this baseline will require subsequent changes to this plan.
D.3.2.2 Test Prerequisites

The WFTS test program schedule depicted in Figure D.1.2 includes the conduct of a system walkthrough. This walkthrough involves a demonstration that system test procedures are ready to support user acceptance testing.

The conduct of this walkthrough and subsequent performance of SAT require that certain prerequisites be in place. These prerequisites may include activities, events, documentation, and products. The prerequisites for the WFTS test program execution are as follows:

• The full test environment configuration is in place, operational, and under CM control.
• The test data environment has been established and baselined.
• All detailed unit and integration test requirements have been successfully exercised as part of the unit and integration test phases.
• Materials supporting test-by-inspection and certification methods are on hand. Materials representing evidence of test-by-analysis are on hand.
• The system test procedure execution schedule is in place.
• Automated test procedure reuse analysis has been conducted.
• A modularity-relationship model has been created.
• System test procedures have been developed in accordance with standards.
• The WFTS system baseline software has been installed in the test environment and is operational.
D.3.2.3 System Acceptance Criteria

The WFTS test program within the AMSI test environment concludes with the satisfaction of the following criteria. In accordance with the test schedule depicted in Figure D.1.2, two site acceptance tests are performed following completion of these criteria.

• SAT has been performed.
• Priority 1–3 software problem reports reported during SAT and priority 2–3 software problem reports that existed prior to SAT have been resolved. The test group has verified the system corrections implemented to resolve these defects.
• A follow-up SAT has been conducted, when required, to review test procedures associated with outstanding priority 1–3 software problem reports. Successful closure of these software problem reports has been demonstrated.
• A final test report has been developed by the test team and approved by FTC.
D.3.2.4 Risks

Risks to the test program (see Table D.3.2) have been identified, assessed for their potential effects, and then mitigated with a strategy for overcoming the risk should it be realized.

Table D.3.2 Test Program Risks

1. COTS Testing. Description: The method of testing requirements that will be supported by the COTS tool InsitFul has not been resolved (issue: certification versus test qualification method). It is also unclear whether the automated test tool is compatible with the InsitFul GUI. Effect: 150 additional test hours, $12,000 cost, 2 weeks schedule slip. Mitigation strategy: Producer of InsitFul has been cooperative; plans in place to test compatibility of product; additional help by supplier is under negotiation.
2. Security Testing. Description: Security plan and security test plan are both in draft form; security requirements not finalized. Effect: 50 hours test rework, $4,000 cost, 2–4 weeks schedule slip. Mitigation strategy: Potential system developer lined up to support test procedure rework and peak load period.
3. Requirement Changes. Description: Requirements pertaining to the Asset Trade component experienced late changes; development staff behind schedule on this component. Effect: 2–4 weeks schedule slip. Mitigation strategy: Monitoring situation; no mitigation strategy identified.
4. COTS Documentation. Description: The Financial Portfolio COTS product slated for use is a beta version, and no product documentation exists. Effect: 40 hours test rework, $3,200 cost, 2 weeks schedule slip. Mitigation strategy: Test team working from documentation for the previous release of the product and attempting to identify differences over the phone.
5. Requirements Definition. Description: Requirements definition for the Asset Trade component is at a high level, and test requirements are unclear. Effect: 60 hours test rework, $4,800 cost, 2–4 weeks schedule slip. Mitigation strategy: Test team working with functional analysts to attempt to obtain greater definition through more detailed use case analyses.
...
D.3.3 Test Strategies

Drawing on the defined test goals and objectives and using the ATLM as a baseline, the test team defined the test strategies that will be applied to support the WFTS test program. The test team will utilize both defect prevention and defect removal technologies, as shown in Table D.3.3.

The AMSI test team will execute the SAT. It will develop test threads to exercise the requirements specified in the detailed requirements/use case documents. The test procedures will specify how a test engineer should execute the test by defining the input requirements and the anticipated results. The detail of this information is controlled through the DOORS test tool and is available on-line. The DOORS database serves as the repository for system requirements and test requirements. The DOORS requirements management tool is used for managing all systems requirements, including business, functional, and design requirements. It is also used for capturing test requirements and test procedures, thus allowing for simple management of the testing process. Using the DOORS scripting language and the associated .dxl files, the test team can automatically create a traceability matrix that will measure the coverage progress of test procedures per test requirement. In turn, test procedures will be derived from the detailed business requirements and use cases and stored in the DOORS database.
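The sketch below is illustrative only: it is not DOORS .dxl, and the export file name and column names (requirement_id, test_procedure_id, status) are assumptions rather than fields taken from the WFTS repository. It shows the kind of rollup the traceability matrix supports, counting linked, executed, and failed test procedures per test requirement from an exported link list; this is a simplified view of the information the test verification summary matrix in Appendix D.B presents.

```python
import csv
from collections import defaultdict

def coverage_summary(links_csv):
    """Summarize coverage from an exported requirement-to-procedure link list.
    Expected columns (hypothetical export layout): requirement_id,
    test_procedure_id, status; status is the recorded result, e.g. 'pass',
    'fail', or '' if the procedure has not yet been run."""
    procedures = defaultdict(list)
    with open(links_csv, newline="") as f:
        for row in csv.DictReader(f):
            procedures[row["requirement_id"]].append(row)

    for req_id in sorted(procedures):
        rows = procedures[req_id]
        executed = [r for r in rows if r["status"]]
        failed = [r for r in rows if r["status"].lower() == "fail"]
        print(f"{req_id}: {len(rows)} procedure(s), "
              f"{len(executed)} executed, {len(failed)} failed")

    return procedures

if __name__ == "__main__":
    coverage_summary("wfts_requirement_links.csv")  # hypothetical export file
```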
The highest-risk functionality has been identified, and the test effort will focus on this functionality. Reuse analysis will be conducted of existing test procedures to avoid rework of automated test procedures available from previous testing efforts. If the automated test tool is not compatible with some of the functionality and no feasible automation work-around solutions can be found, tests will be executed manually.
Table D.3.3 Test Strategies and Techniques

Defect Prevention Technologies
➢ Examination of Constraints
➢ Early Test Involvement
➢ Use of Standards
➢ Inspections and Walkthroughs
➢ Quality Gates

Defect Removal Technologies
➢ Inspections and Walkthroughs
➢ Testing of Product Deliverables
➢ Designing Testability into the Application
➢ Use of Automated Test Tools
➢ Unit Test Phase: Error Handling, Memory Leak, Path Coverage, Fault Insertion, Decision Coverage
➢ Integration Test Phase: Integration Testing
➢ System Test Phase: Functional, Security, Stress/Volume, Performance, Usability
➢ Acceptance Test Phase: Functional, Security, Stress/Volume, Performance, Usability
➢ Strategic Manual and Automated Test Design
➢ Execution and Management of Automated Testing
➢ Random Testing
➢ Test Verification Method
➢ User Involvement
A modularity model will be created that depicts the relationships among the test procedures. Test procedures will be broken down and assigned to the various test engineers, based on the requirements category and the test engineer's business knowledge and expertise. Progress will be monitored, and test procedure walkthroughs will be conducted to verify the accuracy of test procedures and to discover any discrepancies with the business requirement.

The WFTS system will be modeled for scalability using the simulation modeling tool OPNET. This model will simulate the major functions of the WFTS system and provide information about bottlenecks and queue buildups. Inputs to OPNET include arrival rates of the various transactions, the sizes of the transactions, and the processing times at the various stages of the process flow. After the model is built, it must be validated against the test data obtained from the performance testing process. Once this validation is complete, the model can be used to examine what-if scenarios and to predict performance under varying conditions.
07
08
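Although the scalability model itself will be built in OPNET, the sketch below gives a rough sense of the kind of what-if analysis described above: it feeds assumed transaction arrival rates and processing times into a simple single-server queue and reports average wait and maximum queue depth. The figures, function names, and queueing assumptions are illustrative only and are not part of the OPNET model.

```python
import random

def simulate_queue(arrival_rate, service_time_mean, n_transactions=10_000, seed=1):
    """Single-server queue sketch: returns average wait (seconds) and maximum queue depth.

    arrival_rate      -- assumed transactions per second (Poisson arrivals)
    service_time_mean -- assumed mean processing time per transaction, in seconds
    """
    random.seed(seed)
    clock = 0.0           # time of the current arrival
    server_free_at = 0.0  # time the server finishes its current backlog
    pending = []          # completion times of transactions still in the system
    waits = []
    max_queue = 0
    for _ in range(n_transactions):
        clock += random.expovariate(arrival_rate)               # next arrival time
        service = random.expovariate(1.0 / service_time_mean)   # its processing time
        start = max(clock, server_free_at)
        waits.append(start - clock)                             # time spent queued
        server_free_at = start + service
        pending = [t for t in pending if t > clock] + [server_free_at]
        max_queue = max(max_queue, len(pending))                # queue buildup at this instant
    return sum(waits) / len(waits), max_queue

# What-if comparison of a nominal load against a hypothetical peak load.
for label, rate in [("nominal", 20.0), ("peak", 45.0)]:
    avg_wait, max_depth = simulate_queue(arrival_rate=rate, service_time_mean=0.02)
    print(f"{label:8s} avg wait = {avg_wait * 1000:6.1f} ms, max queue depth = {max_depth}")
```
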
D.3.4 Automated Tools

The test team for the WFTS project will use the automated test tools listed in Table D.3.4. The development team uses the PureCoverage and Purify tools during unit testing. During system acceptance testing, the test team will use TestStudio. The application will be analyzed for functionality that lends itself to automation. This strategy will streamline the process of creating and testing certain redundant transactions. Test scripts will be developed following the test procedure development guidelines defined in Appendix D.A.

If software problems are detected, the team will generate defect reports. Software problem reports will be reported to system developers through PVCS Tracker. The DOORS database supports the FTC repository for system requirements, test requirements, and related software problem reports.

TestStudio will be used as the GUI automated test tool. DOORS will serve as the requirements management tool. Performance Studio will be used for performance and stress testing. The TestStudio Test Procedure (Case) Generator will be used to create a baseline of test procedures.

Table D.3.4 Automated Test Tools

Activity/Task | Automated Test Tool
Business Modeling | Rational Rose
Simulation Modeling | OPNET
Requirements Management | DOORS
Load Testing | Performance Studio
Test Management | TestStudio
Configuration Management | PVCS
Defect Tracking | PVCS Tracker
GUI Testing | TestStudio

D.3.5 Qualification Methods

For each test requirement, a testability indicator/qualification method will be used. The following qualification methods will be employed in test procedure steps to verify that requirements have been met:

• Inspection. Inspection verifies conformance to requirements by visual examination, review of descriptive documentation, and comparison of the actual characteristics with predetermined criteria.
• Demonstration. Demonstration verifies conformance to requirements by exercising a sample of observable functional operations. This method is appropriate for demonstrating the successful integration, high-level functionality, and connectivity provided by NDI and COTS software. NDI and COTS products are certified by vendors to have been developed and tested in accordance with software development and quality processes.
• Tests. Testing verifies conformance to requirements by exercising observable functional operations. This method is generally more extensive than that used in demonstrations and is appropriate for requirements fulfilled by developmental items.
• Manual Tests. Manual tests will be performed when automated tests are not feasible.
• Automated Tests. When the outcome of the automation analysis is positive, the test procedures will be automated.
• Analysis. Analysis verifies conformance to requirements by technical evaluation, processing, review, or study of accumulated data.
• Certification. Certification verifies conformance to requirements by examination of vendor (or supplier) documentation attesting that the product was developed and tested in accordance with the vendor's internal standards.

D.3.6 Test Requirements

Test requirements have been derived from the requirements/use cases/use case scenarios developed for the application. In the requirements traceability matrix maintained within the DOORS database, system requirements are mapped to test requirements. The test team worked with the project manager and development team to prioritize system requirements for testing purposes. The test team entered the priority values within DOORS, as shown in the test verification summary matrix depicted in Appendix D.B.

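As a purely illustrative aside, the mapping maintained in DOORS can be thought of as the kind of structure sketched below, which also shows a simple coverage check. The requirement keys, paragraph IDs, and priority codes follow the conventions of Appendix D.B; the test requirement IDs and the uncovered entry are made up for the example and are not WFTS data.

```python
# Hypothetical, in-memory stand-in for the DOORS traceability matrix: each system
# requirement key maps to its specification paragraph, its priority, and the test
# requirements derived from it. Test requirement IDs here are illustrative only.
traceability = {
    178: {"para_id": "3.2.1a", "priority": "NN", "test_requirements": ["TR-2012"]},
    180: {"para_id": "3.2.1c", "priority": "HR", "test_requirements": ["TR-2014a", "TR-2014b"]},
    999: {"para_id": "3.2.9z", "priority": "HR", "test_requirements": []},  # fabricated gap
}

# Coverage check: every system requirement should map to at least one test requirement,
# with high-risk (HR) gaps reported first.
gaps = [(req["priority"], key, req["para_id"])
        for key, req in traceability.items() if not req["test_requirements"]]
for priority, key, para_id in sorted(gaps, key=lambda gap: gap[0] != "HR"):
    print(f"No test requirement mapped for requirement {key} ({para_id}), priority {priority}")
```
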
D.3.7 Test Design

D.3.7.1 Test Program Model

Armed with a definition of test requirements and an understanding of the test techniques that are well suited to the WFTS test program, the test team developed the test program model, which depicts the scope of the test program. The model includes test techniques that will be employed at the development test and system test levels as well as the applicable static test strategies, as shown in Figure D.3.1.

Test Program Model

Static Test Strategies
• Requirements Review
• Product Deliverable Test
• Design Review Participation
• Inspections and Walkthroughs

Other Qualification Methods
• Demonstration
• Analysis
• Inspection
• Certification

Development-Level Techniques
• Error Handling
• Memory Leak
• Path Coverage
• Fault Insertion
• Decision Coverage

System-Level Techniques
• Functional Testing
• Security Testing
• Stress/Volume Testing
• Performance Testing
• Usability Testing

Figure D.3.1 Test Program Model

D.3.7.2 Test Architecture

Having defined a test program model, the test team next constructed a test architecture for the WFTS project. The test architecture depicts the structure of the test program, defining the way that test procedures will be organized in the test effort. Figure D.3.2 depicts the test architecture for the WFTS project, where development-level tests are design-based and system-level tests are technique-based.

The design components shown in Figure D.3.2 were retrieved by the test team from the project's software architecture. Five components are being tested at the development level: System Management (SM-06), Security Guard (SG-07), Distributed Computing (DC-08), Support Applications (SA-09), and Active Trade Visibility (TV-10). For each of these design components, the test techniques that will be applied are noted.

Test Architecture

Development Test Level
• SM-06: Error Handling, Memory Leak
• SG-07: Error Handling, Memory Leak
• DC-08: Error Handling, Memory Leak, Path Coverage, Fault Insertion, Decision Coverage
• SA-09: Error Handling, Memory Leak, Path Coverage, Fault Insertion, Decision Coverage
• TV-10: Error Handling, Memory Leak, Path Coverage, Fault Insertion, Decision Coverage

System Test Level
• Functional: SM-06, SG-07, DC-08, SA-09, TV-10
• Security: SM-06, SG-07, and Security Plan Requirements
• Stress/Volume: TV-10
• Performance: TV-10
• Usability: SM-06, SG-07, DC-08, SA-09, TV-10

Figure D.3.2 Sample Test Architecture

D.3.7.3 Test Procedure Definition

A preliminary step in the test design process involves the development of the test procedure definition, which aids in test development and helps to bound the test effort. The test procedure definition identifies the suite of test procedures that must be developed and executed for the test effort. The design exercise involves the organization of test procedures into logical groups and the allocation of a test procedure number series for each set of tests required.

Table D.3.5 depicts a sample test procedure definition for development-level tests. Column 1 of this table identifies the series of test procedure numbers allotted for testing of the particular design component using the particular technique. Column 2 lists the software or hardware design components to be tested. The design components referenced are retrieved from the test architecture. The test technique is listed in column 3, and the number of test procedures involved in each set of tests (row) is estimated in column 4.

Table D.3.5 Test Procedure Definition (Development Test Level)

TP Numbers Allocated | Design Component ID | Test Technique | Number of Test Procedures
100–150 | SM601–SM634 | Error Handling | 35
151–199 | | Memory Leak | 35
200–250 | SG701–SG728 | Error Handling | 30
251–299 | | Memory Leak | 30
300–350 | DC801–DC848 | Error Handling | 50
351–399 | | Memory Leak | 50
400–599 | | Path Coverage | 200
600–650 | | Fault Insertion | 50
651–849 | | Decision Coverage | 200
850–899 | SA901–SA932 | Error Handling | 35
900–950 | | Memory Leak | 35
951–1150 | | Path Coverage | 200
1151–1199 | | Fault Insertion | 35
1200–1399 | | Decision Coverage | 200
1400–1450 | TV1001–TV1044 | Error Handling | 45
1451–1499 | | Memory Leak | 45
1500–1699 | | Path Coverage | 200
1700–1750 | | Fault Insertion | 45
1751–1949 | | Decision Coverage | 200
1950–1999 | | Integration Test | 25
Total = 1,745

Table D.3.6 depicts a sample test procedure definition for system-level tests. Column 1 of this table identifies the series of test procedures allotted for each particular test technique. Column 2 lists the test technique. Columns 3 through 5 provide information to specify the number of test procedures involved at the system test level. The number of design units or functional threads required for the tests is given in column 3. Four functional threads are planned to support stress and performance testing. Note that usability tests will be conducted as part of functional testing; as a result, no additional test procedures are needed for this test technique. The number of system requirements involved in the tests is identified in column 4, and the number of test requirements is given in column 5. The last column estimates the number of test procedures required for each test technique listed. For functional and security testing, there may be one test procedure for every test requirement. For stress and performance testing, four threads are planned that will need to be altered for each test procedure to examine different system requirements.

Table D.3.6 Test Procedure Definition (System Test Level)

TP Numbering Convention | Test Technique | Number of Units or Threads | Number of System Requirements | Number of Test Requirements | Number of Test Procedures
2000–2399 | Functional | 186 | 220 | 360 | 360
2400–2499 | Security | 62 | 70 | 74 | 74
2500–2599 | Stress | 4 | 12 | 24 | 96
2600–2699 | Performance | 4 | 14 | 14 | 56
— | Usability | 186 | 4 | 4 | —
Total = 586

Table D.3.7 Test Procedure Naming Convention

TP Naming Convention | Design Component/Test Technique | Test Level | Test Procedure Estimate
WF100–WF199 | Systems Management (SM) | Development | 70
WF200–WF299 | Security Guard (SG) | Development | 60
WF300–WF849 | Distributed Computing (DC) | Development | 550
WF850–WF1399 | Support Applications (SA) | Development | 505
WF1400–WF1949 | Active Trade Visibility (TV) | Development | 535
WF1950–WF1999 | Integration Test | Development | 25
WF2000–WF2399 | Functional/Usability Tests | System | 360
WF2400–WF2499 | Security | System | 74
WF2500–WF2599 | Stress | System | 96
WF2600–WF2699 | Performance | System | 56
WF2700 | System Test Shell | System | 1

D.3.7.4 Test Procedure Naming Convention

With the test procedure definition in place for both the development and system levels, the test team adopted a test procedure naming convention to uniquely identify the test procedures on the project. Table D.3.7 provides the test procedure naming scheme for the WFTS project.

With the various tests defined, the test team identified the test procedures that warrant automation and those that can be performed most efficiently via manual methods. Table D.3.8 depicts a portion of a traceability matrix that is maintained using DOORS, which breaks down each test procedure required for system-level testing. Each test procedure in Table D.3.8 is cross-referenced to several other elements, such as design component and test technique. The last column identifies whether the test will be performed using an automated test tool (A) or manually (M).

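A minimal sketch of how the numbering and naming scheme in Tables D.3.5 and D.3.7 could be applied programmatically is shown below. The ranges are taken from Table D.3.7; the function name and output format are illustrative assumptions, not part of the plan.

```python
# Allocated TP number ranges per Table D.3.7: (low, high, component or technique, test level).
NAMING_RANGES = [
    (100, 199, "Systems Management (SM)", "Development"),
    (200, 299, "Security Guard (SG)", "Development"),
    (300, 849, "Distributed Computing (DC)", "Development"),
    (850, 1399, "Support Applications (SA)", "Development"),
    (1400, 1949, "Active Trade Visibility (TV)", "Development"),
    (1950, 1999, "Integration Test", "Development"),
    (2000, 2399, "Functional/Usability Tests", "System"),
    (2400, 2499, "Security", "System"),
    (2500, 2599, "Stress", "System"),
    (2600, 2699, "Performance", "System"),
    (2700, 2700, "System Test Shell", "System"),
]

def test_procedure_name(tp_number: int) -> str:
    """Return the unique WF name and its category for an allocated TP number."""
    for low, high, component_or_technique, level in NAMING_RANGES:
        if low <= tp_number <= high:
            return f"WF{tp_number} ({component_or_technique}, {level} level)"
    raise ValueError(f"TP number {tp_number} is outside the allocated ranges")

print(test_procedure_name(2330))   # a functional system-level procedure from Table D.3.8
print(test_procedure_name(651))    # a decision coverage procedure for DC (Table D.3.5)
```
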
D.3.8 Test Development

Tests are automated based on the automation analysis outcome of the test design phase, as shown in Table D.3.8. They are developed in accordance with the test procedure execution schedule and the modularity-relationship model. Test development must be consistent with the test development guidelines provided in Appendix D.A. Additionally, test procedures will be developed using the automatic test procedure generation feature of the TestStudio test tool.

The test team prepared a test development architecture, depicted in Figure D.3.3, that provides a clear picture of the test development activities (building blocks) necessary to create test procedures. The test development architecture illustrates the major activities to be performed as part of test development.

To conduct its test development activities efficiently, the test team performed an analysis to identify the potential for reuse of existing test procedures and scripts within the AMSI automation infrastructure (reuse library). The results of this reuse analysis are maintained using the DOORS tool and are depicted in Table D.3.9.

Table D.3.8 Automated versus Manual Tests

TP Number | Design Component | Test Technique | SR ID | SWR ID | TR ID | A/M
... | ... | ... | ... | ... | ... | ...
2330 | TV1016 | Functional | 3.2.3c | TV029 | 2220 | A
2331 | TV1016 | Functional | 3.2.3c | TV030 | 2221 | A
2332 | TV1016 | Functional | 3.2.3c | TV031 | 2412 | M
2333 | TV1017 | Functional | 3.2.3d | TV032 | 2222 | A
2334 | TV1017 | Functional | 3.2.3d | TV033 | 2412 | A
2335 | TV1018 | Functional | 3.2.3e | TV034 | 2223 | A
2336 | TV1018 | Functional | 3.2.3e | TV035 | 2412 | M
2337 | TV1019 | Functional | 3.2.3f | TV036 | 2224 | A
2338 | TV1019 | Functional | 3.2.3g | TV037 | 2412 | A
2339 | TV1019 | Functional | 3.2.3g | TV038 | 2225 | A
... | ... | ... | ... | ... | ... | ...

Figure D.3.3 Test Development Architecture (building blocks):
• Configuration Management
• Automation Infrastructure
• Peer Review
• Develop Automated Test Procedures
• Calibration of the Test Tool
• Test Tool Compatibility Work-Around Solutions
• Test Procedure Execution Schedule
• Modularity Relationship Analysis
• Automated Testing Tool User Manual
• Manual Test Procedures (Test Plan)
• Test Design Standards
• Test Development Standards
• Automation Reuse Analysis
• Technical Environment Facilities and Hardware
• Environment Readiness Checks

Table D.3.9 Automation Reuse Analysis

TP Number | Design Component | Test Technique | SR ID | SWR ID | TR ID | A/M | Reuse Asset
2330 | TV1016 | Functional | 3.2.3c | TV029 | 2220 | A | —
2331 | TV1016 | Functional | 3.2.3c | TV030 | 2221 | A | MMS2079
2332 | TV1016 | Functional | 3.2.3c | TV031 | 2412 | M | —
2333 | TV1017 | Functional | 3.2.3d | TV032 | 2222 | A | —
2334 | TV1017 | Functional | 3.2.3d | TV033 | 2412 | M | —
2335 | TV1018 | Functional | 3.2.3e | TV034 | 2223 | A | LW2862
2336 | TV1018 | Functional | 3.2.3e | TV035 | 2412 | M | —
2337 | TV1019 | Functional | 3.2.3f | TV036 | 2224 | A | —
2338 | TV1019 | Functional | 3.2.3g | TV037 | 2225 | A | ST2091
2339 | TV1019 | Functional | 3.2.3g | TV038 | 2226 | A | ST2092
... | ... | ... | ... | ... | ... | ... | ...

D.4 Test Environment

D.4.1 Test Environment Configuration

The test environment mirrors the production environment. This section describes the hardware and software configurations that compose the system test environment. The hardware must be sufficient to ensure complete functionality of the software. Also, it should support performance analysis aimed at demonstrating field performance. Information concerning the test environment pertinent to the application, database, application server, and network is provided below.

Application
• Visual Basic 5.0
• Iona's Orbix V2.3
• Microsoft's Internet Information Server
• Neonet V3.1
• MQ Series V.20
• Windows NT V4.0 service pack 3

Application Server
• Dual-processor PC, 200MHz Pentium processors
• 256MB memory
• 4–6GB hard disk, CD-ROM drive
• 2 Syngoma 503E SNA boards
• Microsoft SNA Server 3.0
• Digital DCE 1.1C with Eco patch
• Encina 2.5 with patches
• Windows NT 4.0 with service pack 3

Database
• Sybase 11 Server V11.x.1 application server
• Microsoft's SNA Server V4.0
• Digital DCE Client and Server with Eco patch V1.1c
• Encina V2.5 with patches

Workstation
• Windows NT V4.0 service pack 3
• Iona's Orbix V2.3

Sybase Configuration
• Application: Sybase 11 Open Client CT-Lib V11.1.0
• Database: Sybase 11 Server V11.x.1
• Sun Solaris for the database server

Network Configuration
• Ethernet switched network

Baseline test laboratory equipment for WFTS central site configurations was acquired for development and testing performed in support of the Delivery 1 WFTS system. Delivery 2 requirements involve additional functionality, and as a result the scope of the test effort must be modified accordingly. Two site configurations must be added to the WFTS test lab configuration. The procurement of additional hardware and software resources is reflected in the test equipment list given in Table D.4.1.

Table D.4.1 Test Equipment Purchase List

Site | Product Requirement | Product Description | Vendor | Quantity | Unit Cost | Annual Maintenance
Site 1 | Application server | Compaq ProLiant 6500 | Compaq | 1 | (cost) | (cost)
Site 1 | Communication server | Compaq ProLiant 1600 | Compaq | 1 | (cost) | (cost)
Site 1 | Database server | Sun Workstation | Sun | 1 | (cost) | (cost)
Site 1 | Server operating system | Windows NT | Microsoft | 2 | (cost) | (cost)
Site 1 | Server operating system | Sun Solaris | Sun | 1 | (cost) | (cost)
Site 1 | Database management system (DBMS) | Sybase Server | Sybase | 1 | (cost) | (cost)
Site 1 | CORBA server | Iona Orbix | Iona | 1 | (cost) | (cost)
... | ... | ... | ... | ... | ... | ...

D.4.2 Test Data

Working in conjunction with the database group, the test team will create the test database. The test database will be populated with unclassified production data. The configuration management group will baseline the test environment, including the test database. Additionally, during performance testing, test data will be generated using Rational's Performance Studio tool. These data will be baselined in the PVCS configuration management tool. To assure adequate testing depth (volume of test database of 10 records versus 10,000 records), the test team will mirror the production-size database during performance testing. To assure adequate testing breadth (variation of data values), it will use data with many variations, again mirroring the production data environment. Test data will use the procedure data definitions whenever possible.

Table D.4.2 is a matrix that cross-references test data requirements to each individual test procedure that is planned for system testing.

Table D.4.2 System Test Data Definition

TP Number | Design Component | Data Requirement | Description
... | ... | ... | ...
2330 | TV1016 | Database tables | Screen inputs
2331 | TV1016 | Variable input | Range of data values (see test requirement)
2332 | TV1016 | Variable input | Range of data values (see test requirement)
2333 | TV1017 | Data object | Requires a bitmapped TIFF data object
2334 | TV1017 | Variable input | Range of data values (see test requirement)
2335 | TV1018 | Database tables | Screen inputs
2336 | TV1018 | — | Printer output test using existing data
2337 | TV1019 | Data object | Requires a bitmapped TIFF data object
2338 | TV1019 | Variable input | Range of data values (see test requirement)
2339 | TV1019 | Database tables | Screen inputs
... | ... | ... | ...

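The sketch below illustrates the depth and breadth goals described above by generating a production-sized file of widely varied records. The field names, value ranges, and record count are hypothetical and are not drawn from the WFTS data model or the Performance Studio tool.

```python
# Minimal sketch of generating varied, production-sized test data (hypothetical schema).
import csv
import random

random.seed(42)

SYMBOLS = ["WFT", "AMS", "FTC", "NYX", "ABC"]    # hypothetical instruments
SIDES = ["BUY", "SELL"]

def make_trade(record_id: int) -> dict:
    """Build one varied trade record (breadth: ranges, boundary values)."""
    return {
        "trade_id": record_id,
        "symbol": random.choice(SYMBOLS),
        "side": random.choice(SIDES),
        "quantity": random.choice([1, 100, 5_000, 999_999]),   # include boundary values
        "price": round(random.uniform(0.01, 10_000.00), 2),
        "account": f"ACCT{random.randint(1, 500):04d}",
    }

# Depth: mirror a production-sized table rather than a handful of rows.
with open("wfts_test_trades.csv", "w", newline="") as handle:
    writer = csv.DictWriter(handle, fieldnames=make_trade(0).keys())
    writer.writeheader()
    for i in range(1, 10_001):                                  # 10,000 records, per the depth goal
        writer.writerow(make_trade(i))
```
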
D.5 Test Execution

D.5.1 Test Program Reporting

An earned value management system will be used to track test program progress, including cost and schedule measures. Earned value involves tracking the value of completed work relative to planned costs and actual costs, so as to provide a true measure of cost status and to enable AMSI's personnel to define effective corrective actions. Four primary steps make up the earned value process (a brief worked illustration follows the list):

1. Identify short tasks (functional test phase).
2. Schedule each task (task start date and end date).
3. Assign a budget to each task (for example, a task will require 3,100 hours using four test engineers).
4. Measure the progress of each task (schedule and cost variance).

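As a minimal illustration of step 4, the following sketch computes schedule and cost variance (and the corresponding performance indices) for a single task using the standard earned value formulas. The task figures are hypothetical and are not the WFTS budget.

```python
# Hypothetical earned value figures for one functional test task (not actual WFTS data).
budget_at_completion = 3_100          # budgeted hours for the task
percent_scheduled = 0.50              # portion of the task planned to be done by now
percent_complete = 0.40               # portion of the task actually completed
actual_hours = 1_500                  # hours actually spent so far

planned_value = budget_at_completion * percent_scheduled    # PV (BCWS)
earned_value = budget_at_completion * percent_complete      # EV (BCWP)
actual_cost = actual_hours                                   # AC (ACWP)

schedule_variance = earned_value - planned_value             # negative means behind schedule
cost_variance = earned_value - actual_cost                   # negative means over cost
schedule_performance_index = earned_value / planned_value
cost_performance_index = earned_value / actual_cost

print(f"SV = {schedule_variance:+.0f} hours, SPI = {schedule_performance_index:.2f}")
print(f"CV = {cost_variance:+.0f} hours, CPI = {cost_performance_index:.2f}")
```
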
The primary tasks to be performed by the test team have been identified consistent with the work breakdown structure outlined in Table D.2.1. A detailed test schedule has been prepared identifying each task. For each task, timeframes have been determined and hours and personnel have been allocated. The SAT test execution schedule is detailed in Section D.6.

After a test procedure has been executed, the test team will undertake evaluation activities to assure that the test outcome was not the result of a false-positive or false-negative condition. The test procedure status is then revised with the requirements management tool to reflect actual test results, such as full, partial, or failed demonstration of compliance with the expected outcome, as defined in the test procedure.

D.5.2 Test Program Metrics

Table D.5.1 shows the test progress metrics that will be collected and reported. The quality assurance group will report on the quality metrics.

Table D.5.1 Test Program Metrics

Test procedure execution status: Number of executed test procedures versus total number of test procedures. This test procedure execution metric will indicate the extent of the testing effort still outstanding.
Error discovery rate: Number of total defects found versus number of test procedures executed. The error discovery rate metric uses the same calculation as the defect density metric. It is used to analyze and support a rational product release decision.
Defect aging: Date defect was opened versus date defect was fixed. The defect aging metric provides an indication of turnaround of the defect.
Defect fix retest: Date defect was fixed and released in a new build versus date defect was retested. The defect fix retest metric provides an idea of whether the testing team is retesting the fixes fast enough to get an accurate progress metric.
Defect trend analysis: Number of total defects found versus number of test procedures executed over time. Defect trend analysis can help determine the trend of defects found. Is the trend improving as the testing phase is winding down?
Problem reports: Number of software problem reports broken down by priority. The problem reports measure counts the number of software problems reported, listing them by priority.

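The sketch below shows how a few of the metrics in Table D.5.1 might be computed from periodic counts. The counts and dates are hypothetical; only the total of 586 system-level test procedures is taken from Table D.3.6.

```python
from datetime import date

# Hypothetical weekly counts (illustrative only, not WFTS figures).
total_test_procedures = 586          # system-level total, per Table D.3.6
executed = 212
defects_found = 97

execution_status = executed / total_test_procedures    # test procedure execution status
error_discovery_rate = defects_found / executed         # defects per executed procedure

# Defect aging: days between the date a defect was opened and the date it was fixed.
opened, fixed = date(1999, 12, 1), date(1999, 12, 9)     # hypothetical dates
defect_aging_days = (fixed - opened).days

print(f"Execution status: {execution_status:.0%} of procedures executed")
print(f"Error discovery rate: {error_discovery_rate:.2f} defects per executed procedure")
print(f"Defect aging: {defect_aging_days} days")
```
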
D.5.3 Defect Tracking

To track defects, a defect workflow process has been implemented. Defect workflow training will be conducted for all test engineers. The steps in the defect workflow process are as follows (a small sketch of the resulting status transitions follows the list):

1. When a defect is generated initially, the status is set to "New." (Note: how to document the defect, which fields need to be filled in, and so on also need to be specified.)
2. The tester selects the type of defect:
   • Bug
   • Cosmetic
   • Enhancement
   • Omission
3. The tester then selects the priority of the defect:
   • Critical: fatal error
   • High: needs immediate attention
   • Medium: needs to be resolved as soon as possible, but not a showstopper
   • Low: cosmetic error
4. A designated person (in some companies, the software manager; in others, a special board) evaluates the defect, assigns a status, and modifies the type of defect and/or priority if applicable.
   • The status "Open" is assigned if it is a valid defect.
   • The status "Close" is assigned if it is a duplicate defect or user error. The reason for closing the defect needs to be documented.
   • The status "Deferred" is assigned if the defect will be addressed in a later release.
   • The status "Enhancement" is assigned if the defect is an enhancement requirement.
5. If the status is determined to be "Open," the software manager (or other designated person) assigns the defect to the responsible person (developer) and sets the status to "Assigned."
6. Once the developer is working on the defect, the status can be set to "Work in Progress."
7. After the defect has been fixed, the developer documents the fix in the defect tracking tool and sets the status to "Fixed," if it was fixed, or "Duplicate," if the defect is a duplication (specifying the duplicated defect). The status can also be set to "As Designed" if the function executes correctly. At the same time, the developer reassigns the defect to the originator.
8. Once a new build is received with the implemented fix, the test engineer retests the fix and other possibly affected code. If the defect has been corrected with the fix, the test engineer sets the status to "Close." If the defect has not been corrected with the fix, the test engineer sets the status to "Reopen."

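The sketch below encodes the status transitions described in these steps as a simple state table so that an illegal transition can be rejected. It is a minimal illustration, not the PVCS Tracker implementation, and the transition out of "Reopen" is an assumption.

```python
# Allowed status transitions drawn from the workflow steps above (illustrative only;
# the actual workflow is enforced through PVCS Tracker, not this code).
ALLOWED_TRANSITIONS = {
    "New":              {"Open", "Close", "Deferred", "Enhancement"},
    "Open":             {"Assigned"},
    "Assigned":         {"Work in Progress"},
    "Work in Progress": {"Fixed", "Duplicate", "As Designed"},
    "Fixed":            {"Close", "Reopen"},
    "Reopen":           {"Assigned"},   # assumed: a reopened defect is routed back to a developer
}

def change_status(current: str, new: str) -> str:
    """Return the new status if the transition is allowed, otherwise raise an error."""
    if new not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"Illegal defect status change: {current} -> {new}")
    return new

# Example: a defect that is opened, assigned, fixed, retested, and closed.
status = "New"
for next_status in ["Open", "Assigned", "Work in Progress", "Fixed", "Close"]:
    status = change_status(status, next_status)
    print("Status is now:", status)
```
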
Defect correction is the responsibility of system developers; defect detection is the responsibility of the AMSI test team. The test leads will manage the testing process, but the defects will fall under the purview of the configuration management group. When a software defect is identified during testing of the application, the tester will notify system developers by entering the defect into the PVCS Tracker tool and filling out the applicable information.

AMSI test engineers will add any attachments, such as a screen print, relevant to the defect. The system developers will correct the problem in their facility and implement the fix in the operational environment after the software has been baselined. This release will be accompanied by notes that detail the defects corrected in this release as well as any other areas that were changed as part of the release. Once the release is implemented, the test team will perform a regression test of each modified area.

The naming convention for attachments will be the defect ID (yyy) plus Attx (where x = 1, 2, 3, ..., n); for example, the first attachment for defect 123 should be called 123Att1. If additional changes have been made other than those required for previously specified software problem reports, they will be reviewed by the test manager, who will evaluate the need for additional testing. If deemed necessary, the test manager will plan additional testing activities. The test manager will also have responsibility for tracking defect reports and ensuring that all reports are handled on a timely basis.

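A small sketch of the attachment naming rule described above is shown below; the function name is illustrative.

```python
def attachment_name(defect_id: int, sequence: int) -> str:
    """Name an attachment as the defect ID plus 'Att' plus its sequence number (1, 2, 3, ...)."""
    if sequence < 1:
        raise ValueError("Attachment sequence numbers start at 1")
    return f"{defect_id}Att{sequence}"

print(attachment_name(123, 1))   # -> 123Att1, the first attachment for defect 123
print(attachment_name(123, 2))   # -> 123Att2
```
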
D.5.4 Configuration Management

The CM department is responsible for all CM activities and will verify that all parties involved are following the defined CM procedures. System developers will provide object code only for all application updates. It is expected that system developers will baseline their code in a CM tool before each test release. The AMSI test team will control the defect reporting process and monitor the delivery of associated program fixes. This approach will allow the test team to verify that all defect conditions have been properly addressed.

D.6 Detailed Test Schedule

A portion of the detailed SAT test schedule is provided in Table D.6.1.

Table D.6.1 Test Schedule

Task ID | Task Description | Duration | Start | Finish
... | ... | ... | ... | ...
22 | Develop SAT test responsibilities | 1d | 11/25 | 11/25
23 | Develop review and reporting methods | 1d | 11/26 | 11/26
24 | Develop management of test sessions | 1d | 11/27 | 11/27
25 | Verify CM activities | 1d | 11/27 | 11/27
26 | Verify change-control activities | 1d | 11/27 | 11/27
27 | Develop issue/problem reporting standards | 1d | 11/30 | 11/30
28 | Develop SAT test procedures | 59d | 12/12 | 2/12
29 | Develop functional/usability test procedures | 55d | 12/12 | 2/8
30 | Develop security test procedures | 15d | 12/22 | 1/7
31 | Develop stress/volume test procedures | 16d | 1/7 | 1/23
32 | Develop performance test procedures | 14d | 1/23 | 1/27
33 | Develop system test shell procedures | 3d | 2/9 | 2/12
... | ... | ... | ... | ...

Appendix D.A Test Procedure Development Guidelines

AMSI's standard test procedure development guidelines for the WFTS project are outlined below. These guidelines are available in the AMSI CM library.

Table D.A.1 Test Development Guidelines

Design-to-development transition: Specify how design and setup activities will be translated into test development action.
Reusable Test Procedures: Test procedures need to be reusable for the highest test program return on investment.
Data: Avoid hard-coding data values into scripts, rendering them not reusable.
Application navigation: A standard navigation method needs to be deployed for reusable test scripts.
Bitmap image recording: Addresses the use of the bitmap image recording method for test procedure development.
Automation wildcards: Development guidelines supporting reusable test procedures.
Capture/playback: Outlines how to apply the use of capture/playback recording.
Maintainable Test Procedures: Test procedures whose defects are easy to remove and that can easily be adapted to meet new requirements.
Cosmetic standards: Standards defined to promote test program code that is easy to read and comprehend.
Test script comments: Specifies where and how comments are used within procedures and scripts.
Test script documentation: Specifies that test script documentation is important for test procedure maintainability.
Test/application synchronization: How to synchronize the server/GUI/AUT with the test script.
Test procedure index: Guidelines supporting the maintenance of an index to find test procedures of interest.
Error handling: Guidelines for how test procedures will handle errors.
Naming standards: Defines the standard naming convention for test procedures.
Modularity: Guidelines for creating modular test scripts.
Looping constructs: Looping constructs support script modularity.
Branching constructs: Branching constructs support script modularity.
Context independence: Directs development of test procedures given test procedure relationships.
Global files: Globally declared functions are available to any procedure and support maintainability.
Constants: Guidelines addressing the use of constants to support maintainable test procedures.
Other Guidelines: Other test development guidelines.
Output format: Users need to define the desired appearance of the test procedure results output.
Test procedures/verification points: Guidelines can specify which test procedure to use most often and which ones to avoid.
User-defined verification: Addresses the use of script programming for user-defined verification.
API calls, dynamic link libraries (.dll): Addresses test automation using APIs and .dlls as part of the user-defined verification methods.

Appendix D.B Test Verification Summary Matrix

A description of the columns contained within the test verification summary matrix is provided in Table D.B.1, and the actual test verification summary matrix for WFTS Delivery 2 is provided in Table D.B.2. The test verification summary matrix represents an example of the type of requirements traceability matrix that can be generated using DOORS. This matrix links the test procedures to test requirements, enabling the test team to verify the test coverage.

Table D.B.1 Test Verification Summary Matrix Terminology

Para ID: The paragraph number of the particular requirement from the WFTS system specification document.
Text: The text of the requirement statement.
Key: The unique requirement identification number generated by the requirements management tool for that requirement statement.
Method: The verification (qualification) method to be used to verify that the requirement has been satisfied by the system solution.
Pri: Identifies the priority of the requirement: CR = critical, HR = high risk, PM = technical performance measure, NN = noncritical.
D1/D2/D3: Identifies the system delivery (either D1, D2, or D3) in which the solution to the requirement has been implemented.
Test Procedure: Identifies the test procedure that exercises a test of the requirement.

Table D.B.2 Test Verification Summary Matrix

Para ID | Text | Key | Method | Pri | D1 | D2 | D3 | Test Procedure
3.2.1a | System shall perform software installation and upgrades | 178 | Test | NN | D1 | — | — | SM2012
3.2.1b | System shall perform software system load balancing for WFTS system servers | 179 | Test | NN | — | D2 | — | SM2013
3.2.1c | System shall perform a recovery of the system and data in the event of a system failure | 180 | Test | HR | — | D2 | — | SM2014
3.2.1d | System shall manage disk and file structure and allocation, including the ability to determine the amount of disk space used and available | 181 | Test | NN | — | D2 | — | SM2015
3.2.1e | System shall be able to configure electronic mail and manage directory service capabilities | 182 | Test | NN | D1 | — | — | SM2016
3.2.1f | System shall monitor the software configuration of critical system components and workstations, including checks for outdated versions | 183 | Test | NN | — | D2 | — | SM2017
... | ... | ... | ... | ... | ... | ... | ... | ...

Appendix D.C Test Procedures and Test Scripts

Manual test procedures supporting SAT are documented within the DOORS database. Automated test procedures and test scripts supporting SAT are maintained using the TeamTest test tool.