Appendix D Sample Test Plan
D.1 Introduction

D.1.1 Purpose
This test plan outlines and defines the strategy and approach taken to perform testing on the WallStreet Financial Trading System (WFTS) project. It is intended for use by WFTS project personnel in understanding and carrying out prescribed test activities and in managing these activities through successful completion. This document defines the details of test responsibilities and activities and describes the tests to be conducted.

This test plan has been developed to fulfill the following objectives:
• To lay out the management and technical effort necessary to support testing throughout the system development life cycle
• To establish a comprehensive test plan that identifies the nature and extent of tests deemed necessary to achieve the testing objectives for the WFTS project, including software and hardware requirements
• To coordinate an orderly schedule of events, identify equipment and organizational requirements, describe test methodologies and strategies to be used, and identify items to be delivered
• To provide a plan that outlines the contents of detailed test procedure scripts and the execution of those test procedure scripts (that is, which testing techniques will be used)
To help standardize the test effort and make it more efficient, test procedure development guidelines are provided in Appendix D.A. These guidelines have been adopted and are being implemented by the AMSI test team for the WFTS project. The test team will take advantage of testing tools to help improve and streamline the testing process. For further detail on the test strategy, see Section D.3.3 of this plan.

Test procedures are identified and tracked using the Dynamic Object-Oriented Requirements Management System (DOORS) requirements management tool. This approach allows for easy management of test progress status. Once a test is performed, the test procedure status is revised within DOORS to reflect actual test results, such as pass/fail. Appendix D.B provides a test verification summary matrix that is generated using DOORS; it links the test procedures to test requirements so as to measure test coverage. Test procedures and test scripts supporting system acceptance test (SAT) are provided in Appendix D.C.
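As an illustrative sketch only (the requirement and procedure identifiers below are hypothetical, and the actual matrix is produced by DOORS rather than by hand-written code), a verification summary matrix reduces to linking each test requirement to the procedures that exercise it and to the latest recorded result:

```python
# Hypothetical linkage of test requirements to test procedures and results.
# In the WFTS test program this information lives in DOORS; this sketch only
# illustrates the coverage measure the summary matrix supports.
procedure_links = {
    "TR-2220": {"procedures": ["TP-2330"], "status": "pass"},
    "TR-2221": {"procedures": ["TP-2331"], "status": "fail"},
    "TR-2222": {"procedures": [], "status": None},  # not yet covered
}

covered = [tr for tr, link in procedure_links.items() if link["procedures"]]
coverage = len(covered) / len(procedure_links)
print(f"Test requirements covered by at least one procedure: {coverage:.0%}")
for tr, link in procedure_links.items():
    print(tr, link["procedures"] or "(no procedure)", link["status"] or "not run")
```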
D.1.2 Background

The WFTS project was initiated in response to management’s recognition of the need for improvement within the service management operations at Financial
[Figure D.1.1 WFTS Delivery 2 Software Architecture: cross-functional applications (active trading visibility, asset trading, financial portfolio management, and forecast and decision support) and support applications sit above an application platform that provides user interface, data management, data interchange, network, system management, security guard, and distributed computing services, all running on the operating system.]
Table D.1.1 WFTS Software Components

ID Number  Description                     DI   NDI/COTS  D1  D2
OS-01      Operating system                —    COTS      D1  —
UI-02      User interface                  —    COTS      D1  D2
DM-03      Data management                 DI   —         D1  D2
DI-04      Data interchange                DI   —         D1  D2
NW-05      Network                         —    COTS      D1  —
SM-06      System management               20%  80%       D1  D2
SG-07      Security guard                  —    COTS      —   D2
DC-08      Distributed computing           30%  70%       D1  D2
SA-09      Support applications            80%  20%       —   D2
TV-10      Active trade visibility         25%  75%       —   D2
FP-11      Financial portfolio management  20%  80%       —   D2
AT-12      Asset trading                   DI   —         —   D2
DS-13      Forecasts and decision support  DI   —         —   D2
[Figure D.1.2 Test Program Milestone Schedule: an August-through-April timeline covering test plan development, test procedure script development, the unit test, integration test, and system test phases, the code and system walkthroughs, security test, SAT, and the Site 1 and Site 2 installations.]
The major events, activities, and documentation to be performed or prepared in support of the WFTS test program are outlined in the test program milestone schedule depicted in Figure D.1.2.

D.2 Roles and Responsibilities

Roles and responsibilities of the various groups are defined in this section.
D.2.1 Project Organization

Figure D.2.1 depicts the WFTS project organization. Reporting to the WFTS project manager are four line supervisors: the software development manager, the systems engineering manager, the product assurance manager, and the functional requirements manager. The software development manager is responsible for software and database design and development, as well as unit- and integration-level software tests. The systems engineering manager leads the system architecture design effort and is responsible for new COTS product evaluations. This manager maintains the network that supports the system development and test environments, and is responsible for database administration of the deployed Delivery 1 WFTS system. The product assurance manager is responsible for test, configuration management, and quality assurance activities.

The test manager is responsible for system test and user acceptance test activities supporting the WFTS system. The functional requirements manager is responsible for requirements analysis, system requirements specification, and maintenance of the requirements baseline. Functional analyst personnel also support development and review of detailed design activities.

[Figure D.2.1 WFTS Project Organization: the project manager at the top tier; the software development, systems engineering, product assurance, and functional requirements managers at the second tier; and the test manager, CM manager, and QA manager at the third tier.]
D.2.2 Project Roles and Responsibilities

D.2.2.1 Project Management

The project manager is responsible for client relations, project deliverables, schedules, and cost accounting. He or she coordinates with the particular line manager with regard to each technical task performed. The staff of project management specialists maintains project plans, schedules, and cost accounting information. Project management is responsible for ensuring that standards and procedures are followed and implemented appropriately.
D.2.2.2 Functional Requirements

The requirements group is responsible for requirements analysis and system requirements specification and for the derivation of subsequent use cases. This group also supports development and review of detailed design activities.
D.2.2.3 Software Development

The software development group is responsible for software development, as well as unit and integration software tests. It must develop software products in accordance with software development standards and conventions as specified in the software development plan (SDP). The software development group also performs unit and integration test phase planning. The results of unit and integration test phase planning are then provided as input to Section D.3 of the test plan.

For software development items, each developer will maintain a systems development folder (SDF) that contains the design documentation, printed copies of lines of code and user screens generated, development status of the item, and test results applicable to the item.

Test support responsibilities of the software development group include those described here.
Software Product Design and Development. When designing and developing any software or database product, the developer will comply with the software development standards and conventions specified in the SDP. Certain SDP provisions are automatically enforceable, such as the use of system development folders and compliance with the procedures associated with the use of the product development reuse library. Testability will be incorporated into the software as defined in the SDP. The third-party controls (widgets) defined for the development of this system must comply with the list of third-party controls that are compatible with the automated testing tool. The test team will be informed of peer reviews and code walkthroughs initiated by the development team.
Development Documentation. The development team will maintain SDFs. Embedded within the lines of programming code will be documentation in the form of comments. The embedded comments facilitate understanding of software structure and define the purpose of software routines. They will trace or correlate to pseudocode so as to facilitate software design traceability from the actual source code to the design document.
Unit Test Phase. Developers will test individual software units with respect to their function and integrity. Software unit program code will be analyzed to ensure that the code corresponds to functional requirements. Tracing tools will be used to minimize code volume and eliminate dead code. Memory leakage tools will be applied, and code coverage tools will be used to verify that all paths have been tested. Unit testing will be performed in accordance with AMSI standards and procedures and will be witnessed by the system test team.
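As a minimal sketch of the unit test phase described above (the module, function, and values are hypothetical, and the example is written in Python for brevity; the WFTS units themselves are not reproduced in this plan), a developer-level unit test exercises both the nominal and the error-handling paths so that a coverage tool can confirm all paths were taken:

```python
import unittest

# Hypothetical unit under test: computes the commission on a trade order.
def trade_commission(quantity, price, rate=0.001):
    if quantity <= 0 or price <= 0:
        raise ValueError("quantity and price must be positive")
    return round(quantity * price * rate, 2)

class TradeCommissionUnitTest(unittest.TestCase):
    def test_nominal_commission(self):
        # Nominal path: commission = quantity * price * rate
        self.assertEqual(trade_commission(100, 25.00), 2.50)

    def test_invalid_input_rejected(self):
        # Error-handling path, needed for full path coverage
        with self.assertRaises(ValueError):
            trade_commission(-5, 25.00)

if __name__ == "__main__":
    unittest.main()
```

Running the suite under a coverage tool (for example, coverage.py) confirms that both paths through the unit were exercised.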
Integration Test Phase. Integration testing will be conducted to demonstrate the consistency between the software design and its implementation in accordance with AMSI standards and procedures. Its results will be recorded in the SDFs and inspected for software quality assurance. When software modules are ready to support the integration and system test phases, the source code and all files required for proper generation of the executables will be baselined within the software configuration management tool. Each software build will be generated using the source code products maintained within the software configuration management tool. The system test team will witness integration testing and verify completeness according to integration test procedures.

The software development group is also responsible for database design and development and for all data migration and synchronization activities. Additionally, it helps the test group in setting up a test environment. The database group develops the database in accordance with database development standards and conventions as specified in the SDP.
D.2.2.4 Systems Engineering

The systems engineering group is responsible for the development of the system architecture design and for the integration, research, and evaluation of COTS products. As part of COTS integration, the systems engineering group is responsible for the design and development of software modules as well as testing of the integrated COTS products. The systems engineering group will develop and maintain a simulation model of the WFTS using the OPNET simulation tool. The WFTS simulation model will simulate the major functions of the system and provide information on bottlenecks and queue buildups.

The systems engineering group maintains the network and hardware that support the system development and test environments, and is responsible for database and system security administration of the deployed Delivery 1 WFTS system. The group installs and configures COTS products as required to integrate them with the rest of the system. The necessary parameters are defined for COTS products by this group and then set to work in the target environment. Hardware is installed and configured to reflect a typical end-user site. Upon receipt of new system equipment that is destined for deployment at an end-user site, the appropriate hardware and system software configurations are installed.
D.2.2.5 Product Assurance

The product assurance group implements test, configuration management, and quality assurance activities. The system test team performs the various test activities supporting the WFTS system by following the ATLM. It takes responsibility for system test and user acceptance test activities supporting the WFTS system; it also supports the unit and integration test phases described in Section D.3.

The system test team develops the test plan and procedures, and it performs the tests necessary to ensure compliance with functional, performance, and other technical requirements. Test program activities include the maintenance of test automation reuse libraries, planning and execution of tests, and the development of test reports. These responsibilities are detailed below.
Number  Work Breakdown Structure (WBS) Element

6.7     Critical success functions. Work with project team and business users to identify critical success functions and document them within the test plan.
6.8     Test program parameters. Define test program parameters such as assumptions, prerequisite activities, system acceptance criteria, and test program risks and document them within the test plan.
6.9     Level of quality. Work with project team and business users to determine the level of quality for the project and document it within the test plan.
6.10    Test process. Document the test process within the test plan, including the test tool introduction process and the defect management process.
6.11    Test training. Document test training requirements and plans within the test plan.
6.12    Decision to automate test. Document the assessment outlining the benefit of using an automated test tool on the project, and the ability to incorporate an automated test tool given the project schedule.
6.13    Technical environment. Document the technical environment in which the AUT will be developed and eventually operate. Identify potential application design or technical automated testing tool issues that may need to be resolved.
6.14    Test tool compatibility check. Document results of the test tool compatibility check. Where an incompatibility problem arises, document work-around solutions and alternative test methods.
6.15    Quality gates. Plan for the incorporation of quality gates.
6.16    Risk assessments. Perform risk assessments in support of project management reviews and reporting requirements.
6.17    Test readiness reviews. Perform planning and analysis activities necessary for test readiness reviews. Develop presentation slides and perform presentations where required.
6.18    Test plan document. Assemble and package the test-planning documentation into a test plan. Incorporate changes to the test plan as a result of test plan reviews by project management and end users or customers. Maintain the test plan document throughout the test life cycle.
6.19    Test data. Document test data requirements and plans for developing and maintaining a test data repository.
6.20    Test environment. Identify requirements for a test laboratory or test environment and identify the personnel who are responsible for setting up and maintaining this environment.
6.21    Reporting requirements. Define reporting requirements and document them within the test plan.
6.22    Roles and responsibilities. Define and document roles and responsibilities for the test effort.
6.23    Test tool system administration. Outline the requirements for setting up and maintaining the automated test tools and environment, and identify the personnel who are responsible for setting up and maintaining the test tools. Administration includes setup of tool users and various privilege groups.
(continued)
Number  Work Breakdown Structure (WBS) Element

8.7.4   Acceptance test phase test procedures/scripts. Develop and maintain test procedures and scripts.
8.8     Coordination with the database group to develop test database environment. Baseline and maintain test data to support test execution.
8.9     Test procedure peer reviews. Review test procedures against the script creation standards (comments for each test tool scripting step, header file information, modularity, and so on).
8.10    Reuse library. Develop and maintain a test procedure reuse library for the project.
8.11    Test utilities. Support the creation or modification of in-house test support utilities that improve test effort efficiency.

9       Test Execution
9.1     Environment setup. Develop environment setup scripts.
9.2     Testbed environment. Develop testbed scripts and perform testbed development logistics.
9.3     System test phase execution. Execute test procedures as part of walkthroughs or test demonstrations.
9.4     Acceptance test phase execution. Execute test procedures as part of walkthroughs or test demonstrations.
9.5     Test reporting. Prepare test reports.
9.6     Issue resolution. Resolve daily issues regarding automated test tool problems.
9.7     Test repository maintenance. Perform test tool database backup/repair and troubleshooting activities.

10      Test Management and Support
10.1    Process reviews. Perform a test process review to ensure that standards and the test process are being followed.
10.2    Special training. Seek out training for test engineers for special niche test requirements that become apparent during the test life cycle. Continue to develop technical skills of test personnel.
10.3    Testbed configuration management (CM). Maintain the entire testbed/repository (that is, test data, test procedures and scripts, software problem reports) in a CM tool. Define the test script CM process and ensure that test personnel work closely with the CM group.
10.4    Test program status reporting. Identify mechanisms for tracking test program progress. Develop periodic reports on test progress. Reports should reflect estimates to complete tasks in progress.
10.5    Defect management. Perform defect tracking and reporting. Attend defect review meetings.
10.6    Metrics collection and analysis. Collect and review metrics to determine whether changes in the process are required and whether the product is ready to be shipped.
(continued)
D.3 Test Program
In addition to identifying test goals and objectives, the test team documented test program parameters, including assumptions, prerequisites, system acceptance criteria, and risks.

D.3.2.1 Assumptions

The test team developed this plan with an understanding of several assumptions concerning the execution of the WFTS project and the associated effect on the test program.
Test Performance. The test team will perform all tests on the WFTS project, with the exception of unit and integration phase tests, which are performed by the system developers and witnessed by the system test group.
Security Testing. System security tests, designed to satisfy the security test requirements outlined within the security test plan, will be executed during system testing and will be incorporated into the test procedure set constituting the system acceptance test (SAT).
Early Involvement. The test team will be involved with the WFTS application development effort from the beginning of the project, consistent with the ATLM. Early involvement includes the review of requirement statements and use cases/use case scenarios and the performance of inspections and walkthroughs.
Systems Engineering Environment. The suite of automated tools and the test environment configuration outlined within this plan are based upon existing systems engineering environment plans outlined within the WFTS management plan and the software development plan. Changes in the systems engineering environment will require subsequent and potentially significant changes to this plan.
Test Team Composition. The test team will include three business area functional analysts. These analysts will be applied to the system test effort according to their functional area expertise. While these analysts are on loan to the test group, they will report to the test manager regarding test tasks and be committed to the test effort. They will support the test effort for the phases and percentages of their time noted in Section D.2.4.
Test Limitations. Given the resource limitations of the test program and the limitless number of test paths and possible input values, the test effort has been designed to focus attention on the most critical and high-risk functions of the system. Defect tracking and the associated verification effort likewise focus on assessing these functions and meeting acceptance criteria, so as to determine when the AUT is ready to go into production.
Project Schedule. Test resources defined within the test plan are based upon the current WFTS project schedule and requirements baseline. Changes to this baseline will require subsequent changes to this plan.
D.3.2.2 Test Prerequisites

The WFTS test program schedule depicted in Figure D.1.2 includes the conduct of a system walkthrough. This walkthrough involves a demonstration that system test procedures are ready to support user acceptance testing. The conduct of this walkthrough and the subsequent performance of SAT require that certain prerequisites be in place. These prerequisites may include activities, events, documentation, and products. The prerequisites for WFTS test program execution are as follows:

• The full test environment configuration is in place, operational, and under CM control.
• The test data environment has been established and baselined.
• All detailed unit and integration test requirements have been successfully exercised as part of the unit and integration test phases.
• Materials supporting test-by-inspection and certification methods are on hand. Materials representing evidence of test-by-analysis are on hand.
• The system test procedure execution schedule is in place.
• Automated test procedure reuse analysis has been conducted.
• A modularity-relationship model has been created.
• System test procedures have been developed in accordance with standards.
• The WFTS system baseline software has been installed in the test environment and is operational.
D.3.2.3 System Acceptance Criteria

The WFTS test program within the AMSI test environment concludes with the satisfaction of the following criteria. In accordance with the test schedule depicted in Figure D.1.2, two site acceptance tests are performed following completion of these criteria.

• SAT has been performed.
• Priority 1–3 software problem reports reported during SAT and priority 2–3 software problem reports that existed prior to SAT have been resolved. The test group has verified the system corrections implemented to resolve these defects.
• A follow-up SAT has been conducted, when required, to review test procedures associated with outstanding priority 1–3 software problem reports. Successful closure of these software problem reports has been demonstrated.
• A final test report has been developed by the test team and approved by FTC.
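As a small illustrative sketch of the software problem report criterion above (the SPR records shown are hypothetical), the acceptance gate can be expressed as a simple check over the list of open reports:

```python
# Hypothetical open software problem reports (SPRs) at the time of the acceptance review.
open_sprs = [
    {"id": "SPR-101", "priority": 1, "reported_during_sat": True},
    {"id": "SPR-087", "priority": 3, "reported_during_sat": False},
    {"id": "SPR-045", "priority": 4, "reported_during_sat": False},
]

def blocks_acceptance(spr):
    """Priority 1-3 SPRs reported during SAT, and priority 2-3 SPRs that existed before SAT, must be resolved."""
    if spr["reported_during_sat"]:
        return spr["priority"] in (1, 2, 3)
    return spr["priority"] in (2, 3)

blocking = [spr["id"] for spr in open_sprs if blocks_acceptance(spr)]
print("Acceptance blocked by unresolved SPRs:", blocking or "none")
```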
D.3.2.4 Risks

Risks to the test program (see Table D.3.2) have been identified, assessed for their potential effects, and then mitigated with a strategy for overcoming each risk should it be realized.
D.3.3 Test Strategies

Drawing on the defined test goals and objectives and using the ATLM as a baseline, the test team defined the test strategies that will be applied to support the WFTS test program. The test team will utilize both defect prevention and defect removal technologies, as shown in Table D.3.3.

The AMSI test team will execute the SAT. It will develop test threads to exercise the requirements specified in the detailed requirements/use case documents. The test procedures will specify how a test engineer should execute each test by defining the input requirements and the anticipated results. The detail of this information is controlled through the DOORS test tool and is available on-line. The DOORS database serves as the repository for system requirements and test requirements. The DOORS requirements management tool is used for managing all system requirements, including business, functional, and design requirements. It is also used for capturing test requirements and test procedures, thus allowing for simple management of the testing process. Using the DOORS scripting language and the associated .dxl files, the test team can automatically create a traceability matrix that measures the coverage of test procedures against test requirements. In turn, test procedures will be derived from the detailed business requirements and use cases and stored in the DOORS database.

The highest-risk functionality has been identified, and the test effort will focus on this functionality. Reuse analysis will be conducted on existing test procedures to avoid rework of automated test procedures available from previous testing efforts. If the automated test tool is not compatible with some of the functionality and no feasible automation work-around solution can be found, those tests will be executed manually.
The sample test architecture depicted in Figure D.3.2 addresses the system management (SM-06), security guard (SG-07), distributed computing (DC-08), support applications (SA-09), and active trade visibility (TV-10) design components. For each of these design components, the test techniques that will be applied are noted.

[Figure D.3.2 Sample Test Architecture: at the development test level, error handling, memory leak, path coverage, fault insertion, and decision coverage techniques are mapped to the SM-06, SG-07, DC-08, SA-09, and TV-10 components; at the system test level, functional, security, stress/volume, performance, and usability tests are mapped to selected components, with security testing also covering the security plan requirements.]
D.3.7.3 Test Procedure Definition

A preliminary step in the test design process involves the development of the test procedure definition, which aids in test development and helps to bound the test effort. The test procedure definition identifies the suite of test procedures that must be developed and executed for the test effort. The design exercise involves the organization of test procedures into logical groups and the allocation of a test procedure number series for each set of tests required.

Table D.3.5 depicts a sample test procedure definition for development-level tests. Column 1 of this table identifies the series of test procedure numbers allotted for testing of the particular design component using the particular technique. Column 2 lists the software or hardware design components to be tested. The design components referenced are retrieved from the test architecture. The test technique is listed in column 3, and the number of test procedures involved in each set of tests (row) is estimated in column 4.
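As an illustrative sketch only (the component and technique names echo the test architecture above, but the starting number and block size are hypothetical), the allocation of test procedure number series to each set of tests can be generated mechanically:

```python
# Allocate a block of test procedure numbers to each (design component, technique)
# pair, similar in spirit to the sample test procedure definition in Table D.3.5.
def allocate_series(sets_of_tests, start=2300, block_size=10):
    """Return {(component, technique): (first_number, last_number)}."""
    allocation = {}
    next_number = start
    for component, technique, estimated_count in sets_of_tests:
        allocation[(component, technique)] = (next_number, next_number + estimated_count - 1)
        next_number += max(estimated_count, block_size)  # leave room for growth
    return allocation

development_level_tests = [
    ("SM-06", "Error handling", 8),
    ("SG-07", "Fault insertion", 5),
    ("TV-10", "Path coverage", 12),
]

for key, series in allocate_series(development_level_tests).items():
    print(key, "-> TP", series[0], "through", series[1])
```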
[Figure D.3.3 Test Development Architecture: building blocks supporting the development of automated test procedures, including the technical environment (facilities and hardware) and environment readiness checks; manual test procedures from the test plan, test design standards, test development standards, and automation reuse analysis; the test procedure execution schedule, modularity relationship analysis, and the automated testing tool user manual; calibration of the test tool and test tool compatibility work-around solutions; and peer review, the automation infrastructure, and configuration management.]
Table D.3.9 Automation Reuse Analysis

TP      Design     Test        SR      SWR    TR    A/M  Reuse
Number  Component  Technique   ID      ID     ID         Asset
2330    TV1016     Functional  3.2.3c  TV029  2220  A    —
2331    TV1016     Functional  3.2.3c  TV030  2221  A    MMS2079
2332    TV1016     Functional  3.2.3c  TV031  2412  M    —
2333    TV1017     Functional  3.2.3d  TV032  2222  A    —
2334    TV1017     Functional  3.2.3d  TV033  2412  M    —
2335    TV1018     Functional  3.2.3e  TV034  2223  A    LW2862
2336    TV1018     Functional  3.2.3e  TV035  2412  M    —
2337    TV1019     Functional  3.2.3f  TV036  2224  A    —
2338    TV1019     Functional  3.2.3g  TV037  2225  A    ST2091
2339    TV1019     Functional  3.2.3g  TV038  2226  A    ST2092
...
Test development takes into account the test procedure execution schedule and the modularity-relationship model. Test development must be consistent with the test development guidelines provided in Appendix D.A. Additionally, test procedures will be developed using the automatic test procedure generation feature of the TestStudio test tool.

The test team prepared a test development architecture, depicted in Figure D.3.3, that provides a clear picture of the test development activities (building blocks) necessary to create test procedures. The test development architecture illustrates the major activities to be performed as part of test development.

To conduct its test development activities efficiently, the test team performed an analysis to identify the potential for reuse of existing test procedures and scripts within the AMSI automation infrastructure (reuse library). The results of this reuse analysis are maintained using the DOORS tool and are depicted in Table D.3.9.
D.4 Test Environment

D.4.1 Test Environment Configuration

The test environment mirrors the production environment. This section describes the hardware and software configurations that compose the system test environment. The hardware must be sufficient to ensure complete functionality of the software and should support performance analysis aimed at demonstrating field performance. Information concerning the test environment pertinent to the application, database, application server, and network is provided below.
Application
  Visual Basic 5.0
  Iona’s Orbix V2.3
  Microsoft’s Internet Information Server
  Neonet V3.1
  MQ Series V2.0
  Windows NT V4.0 with service pack 3

Application Server
  Dual-processor PC, 200 MHz Pentium processors
  256 MB memory
  4–6 GB hard disk, CD-ROM drive
  2 Syngoma 503E SNA boards
  Microsoft SNA Server 3.0
  Digital DCE 1.1C with Eco patch
  Encina 2.5 with patches
  Windows NT 4.0 with service pack 3

Database
  Sybase 11 Server V11.x.1 application server
  Microsoft’s SNA Server V4.0
  Digital DCE Client and Server with Eco patch V1.1c
  Encina V2.5 with patches

Workstation
  Windows NT V4.0 with service pack 3
  Iona’s Orbix V2.3

Sybase Configuration
  Application: Sybase 11 Open Client CT-Lib V11.1.0
  Database: Sybase 11 Server V11.x.1
  Sun Solaris for the database server

Network Configuration
  Ethernet switched network
Baseline test laboratory equipment for the WFTS central site configuration was acquired for the development and testing performed in support of the Delivery 1 WFTS system. Delivery 2 requirements involve additional functionality; as a result, the scope of the test effort must be modified accordingly. Two site configurations must be added to the WFTS test lab configuration. The procurement of additional hardware and software resources is reflected in the test equipment list given in Table D.4.1.
D.4.2 Test Data

Working in conjunction with the database group, the test team will create the test database. The test database will be populated with unclassified production data. The configuration management group will baseline the test environment, including the test database. Additionally, during performance testing, test data will be generated using Rational’s Performance Studio tool. These data will be baselined in the PVCS configuration management tool. To assure adequate testing depth (the volume of the test database, such as 10 records versus 10,000 records), the test team will mirror the production-size database during performance testing. To assure adequate testing breadth (variation of data values), it will use data with many variations, again mirroring the production data environment. Test data will use the procedure data definitions whenever possible.

Table D.4.2 is a matrix that cross-references test data requirements to each individual test procedure planned for system testing.
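As a minimal sketch of the depth-and-breadth idea described above (the field names, value ranges, and file names are hypothetical and are not taken from the WFTS data model), production-scale test data with varied values could be generated along these lines:

```python
import csv
import random

# Hypothetical trade-order fields; the real WFTS data definitions are not shown in this plan.
SYMBOLS = ["IBM", "GE", "XON", "T", "MRK"]
ORDER_TYPES = ["BUY", "SELL"]

def generate_test_orders(path, record_count):
    """Write record_count varied trade orders to a CSV file.

    Depth comes from record_count (for example, 10 versus 10,000 rows);
    breadth comes from the variation in symbols, order types, quantities, and prices.
    """
    with open(path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["order_id", "symbol", "order_type", "quantity", "price"])
        for order_id in range(1, record_count + 1):
            writer.writerow([
                order_id,
                random.choice(SYMBOLS),
                random.choice(ORDER_TYPES),
                random.randint(1, 10000),
                round(random.uniform(1.0, 500.0), 2),
            ])

# Small set for functional testing, production-sized set for performance testing.
generate_test_orders("functional_test_orders.csv", 10)
generate_test_orders("performance_test_orders.csv", 10000)
```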
D.5 Test Execution

D.5.1 Test Program Reporting

An earned value management system will be used to track test program progress, including cost and schedule measures. Earned value involves tracking the value of completed work relative to planned costs and actual costs, so as to provide a true measure of cost status and to enable AMSI’s personnel to define effective corrective actions. Four primary steps make up the earned value process:
1. Identify short tasks (for example, the functional test phase).
2. Schedule each task (task start date and end date).
3. Assign a budget to each task (for example, the task will require 3,100 hours using four test engineers).
4. Measure the progress of each task (schedule and cost variance).
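As an illustrative worked example (the budget, percentages, and hours below are hypothetical and are not drawn from the WFTS schedule), the basic earned value calculations reduce to planned value (PV), earned value (EV), and actual cost (AC), from which the schedule and cost variances follow:

```python
# Hypothetical earned value snapshot for a single test task.
budget_at_completion = 3100.0   # budgeted hours for the task
percent_scheduled = 0.50        # portion of the work planned to be done by the status date
percent_complete = 0.40         # portion of the work actually done by the status date
actual_hours_spent = 1500.0     # hours actually charged to the task so far

planned_value = budget_at_completion * percent_scheduled  # PV = 1550 hours
earned_value = budget_at_completion * percent_complete    # EV = 1240 hours
actual_cost = actual_hours_spent                          # AC = 1500 hours

schedule_variance = earned_value - planned_value  # negative: behind schedule
cost_variance = earned_value - actual_cost        # negative: over cost

print(f"SV = {schedule_variance:+.0f} hours, CV = {cost_variance:+.0f} hours")
```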
The primary tasks to be performed by the test team have been identified consistent with the work breakdown structure outlined in Table D.2.1. A detailed test schedule has been prepared identifying each task. For each task, timeframes have been determined and hours and personnel have been allocated. The SAT test execution schedule is detailed in Section D.6.

After a test procedure has been executed, the test team will undertake evaluation activities to ensure that the test outcome was not the result of a false-positive or false-negative condition. The test procedure status is then revised within the requirements management tool to reflect actual test results, such as full, partial, or failed demonstration of compliance with the expected outcome, as defined in the test procedure.
D.5.2 Test Program Metrics

Table D.5.1 shows the test progress metrics that will be collected and reported. The quality assurance group will report on the quality metrics.
Regression testing will address the corrected areas of the application as well as any other areas that were changed as part of the release. Once the changes are implemented, the test team will perform a regression test for each modified area.

The naming convention for defect report attachments will be the defect ID (yyy) plus Attx (where x = 1, 2, 3, . . . n); for example, the first attachment for defect 123 should be called 123Att1. If additional changes have been made beyond those required for previously specified software problem reports, they will be reviewed by the test manager, who will evaluate the need for additional testing. If deemed necessary, the test manager will plan additional testing activities. He or she will have responsibility for tracking defect reports and ensuring that all reports are handled in a timely manner.
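As a small illustrative sketch of this naming convention (the helper names are hypothetical), attachment names can be generated and checked mechanically:

```python
import re

def attachment_name(defect_id, attachment_number):
    """Build an attachment name per the convention: defect ID plus 'Att' plus a sequence number."""
    return f"{defect_id}Att{attachment_number}"

def is_valid_attachment_name(name):
    """Check that a name matches the yyyAttx pattern, for example 123Att1."""
    return re.fullmatch(r"\d+Att\d+", name) is not None

assert attachment_name(123, 1) == "123Att1"
assert is_valid_attachment_name("123Att2")
assert not is_valid_attachment_name("Att123")
```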
D.5.4 Configuration Management

The CM department is responsible for all CM activities and will verify that all parties involved are following the defined CM procedures. System developers will provide object code only for all application updates. It is expected that system developers will baseline their code in a CM tool before each test release. The AMSI test team will control the defect reporting process and monitor the delivery of associated program fixes. This approach will allow the test team to verify that all defect conditions have been properly addressed.
D.6 Detailed Test Schedule

A detailed SAT test schedule (a portion of the schedule) is provided in Table D.6.1.