
Performance Test Strategy

For
XXXXXXX Phase 1a

DRAFT

TABLE OF CONTENTS

REVISION HISTORY

1.  INTRODUCTION

2.  TEST OBJECTIVES

3.  TEST SCOPE
    3.1  Performance Test Scope
         3.1.1  Goals of Performance Tests
    3.2  Performance Testing Scenarios
    3.3  Elements of Work in Performance Test Scope
    3.4  Elements of Work not in Performance Test Scope

4.  TEST APPROACH
    4.1  Performance Test Approach
         4.1.1  Stage 1: Test Strategy Development
         4.1.2  Stage 2: Requirements Review
         4.1.3  Stage 3: Test Script Design and Development
         4.1.4  Stage 4: Test Script Execution

5.  TEST DELIVERABLES
    5.1  Project Test Deliverables
    5.2  Performance Test Deliverables

6.  TEST ENVIRONMENT REQUIREMENTS
    6.1  Configuration
    6.2  Test Data

7.  ASSUMPTIONS, DEPENDENCIES, RISKS
    7.1  Assumptions
    7.2  Dependencies
    7.3  Risks

8.  PERFORMANCE TEST WORK PLAN

Revision History

Each time this document is modified, record the date, the name of the person making the change, and a brief description of the change.

Date         Author    Description of Change
9/10/2001              Draft
11/13/2001             Modified Script Names & Scenarios and Test Goals.

1. INTRODUCTION

This document details the performance test strategy for the XXXXX system project. It describes the test objectives, scope, and approach. In addition, it lists the roles and responsibilities of personnel participating in the test, the operational procedures that will be used during the test, and the test environment requirements.
The document sections are organized as follows:

- Test Objectives describes the overall test objectives.
- Test Scope describes what is and is not in scope for the performance test.
- Test Approach identifies the methodology that will be used in testing the product.
- Test Deliverables outlines the deliverables of Phase 1a.
- Test Environment Requirements describes the hardware, software, data, and staffing needs for the test effort. This section also describes any dependencies on the development environment.
- Assumptions, Dependencies, and Risks describes the assumptions, dependencies, and risks of the project's performance tests.
- Performance Test Work Plan describes the main milestones and outlines the project plan for testing of the product.

2. TEST OBJECTIVES

The objectives of the performance test team are to ensure that performance test scripts are generated and executed, and that the corresponding test execution reports are created for the XXXXX performance test project. To determine the scope of the testing requirements, the Performance Test Engineer conducted an assessment of the XXXXX system technical specifications. Once the requirements have been identified, summary and detailed test scripts will be constructed using the Mercury Interactive LoadRunner testing software.

Estimates for test script creation are based on one test script per business process; however, several different permutations may require testing, which would increase both the number of test scripts and the estimated script preparation time.

The main objective of performance testing is to help ensure the XXXXX system performs within set performance guidelines for the selected functional transactions running under varying user load scenarios. The performance test team will work with the development team to define the required functional transactions, user load scenarios, and performance guidelines.

3. TEST SCOPE

3.1 Performance Test Scope

The performance test effort will focus on developing and executing a performance test model for the XXXXX system. A performance test model consists primarily of virtual user scripts and load scenarios. All activities required to perform a variety of XXXXX system transactions will be recorded into a virtual user script. The virtual user script is used to simulate one or more users executing the business process. A load scenario defines which virtual user scripts will be executed, how many simulated users will execute each virtual user script, and how many iterations of each virtual user script are scheduled to be executed by each user.

We will assist the development team in determining how many concurrent users, and how much user think time, will be included in the performance test. We will also assist in determining the transaction mix, as well as the connection speed setting for LoadRunner, which can simulate speeds from 2400 baud up to T1.
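The relationship between groups, scripts, and vusers described above can be sketched in a few lines. This is an illustrative model only, not LoadRunner code; the group and script names follow the naming pattern used in the scenario tables of section 3.2.

```python
# Illustrative sketch: a load scenario pairs each user group with a virtual
# user script and a vuser quantity; total concurrent load is the sum of
# vusers across all groups.

scenario = [
    # (group name, script name, vuser quantity)
    ("PropTestCombo1",   "PropTestCombo1", 10),
    ("PropTestCombo1_1", "PropTestCombo1", 10),
    ("PropTestCombo2",   "PropTestCombo2", 10),
]

def total_vusers(scenario):
    """Total simulated users the scenario will generate."""
    return sum(qty for _group, _script, qty in scenario)

print(total_vusers(scenario))  # 30 for this three-group example
```

The same accounting, applied to the full group tables below, is how the "Total number of vusers" figures for each scenario are obtained.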
3.1.1 Goals of Performance Tests
1. Meet or exceed the following volume impact on the network, based on the following figures:

   Year    Expected No. of DIPs              Expected No. of Completed Applications
           (per year / per day / per hour)   (per year / per day / per hour)
   2005    20,563 / 66 / 9                   17,136 / 55 / 7
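The per-day and per-hour figures above can be reproduced from the yearly totals under an assumed working calendar. Note that the 313-working-day year and 7.5-hour working day below are assumptions made for illustration; neither figure appears in this document.

```python
# Sanity check of the volume figures. Both constants are ASSUMPTIONS
# (roughly a six-day working week and a 7.5-hour working day); they happen
# to reproduce the table's rounded per-day and per-hour values.

WORK_DAYS_PER_YEAR = 313   # assumption
HOURS_PER_DAY = 7.5        # assumption

def daily_and_hourly(yearly):
    per_day = yearly / WORK_DAYS_PER_YEAR
    per_hour = per_day / HOURS_PER_DAY
    return round(per_day), round(per_hour)

print(daily_and_hourly(20_563))  # DIPs: (66, 9)
print(daily_and_hourly(17_136))  # Completed applications: (55, 7)
```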

2. Meet or exceed the following performance target figures from a user satisfaction perspective:

   Business Function                                   Response Time
   Content Pages (Product, Brochure & Company Info)    95% within 3 seconds from screen to screen
   Registration                                        95% within 5 seconds from request to authorization
   Logon/Logoff                                        95% within 5 seconds from request to response
   Calculators                                         95% within 3 seconds from screen to screen using Save or Back buttons
   Decision in Principle                               95% within 3 seconds from screen to screen using Save or Back buttons
   Application Form                                    95% within 3 seconds from screen to screen using Save or Back buttons
   Quick Address Query                                 95% within 3 seconds from screen to screen using Save or Back buttons
   Calculator                                          95% within 15 seconds from request to response
   DIP request                                         95% within 30 seconds from request to response
   Case Tracking                                       95% within 15 seconds from request to response
   Application Transfer                                95% within 30 seconds from request to response
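Each target above is a 95th-percentile bound, so verifying it amounts to computing the 95th percentile of the measured response times for a transaction and comparing it to the limit. A hedged sketch, using made-up sample timings (the real samples would come from the LoadRunner test execution reports):

```python
# Checking a "95% within N seconds" target against raw response-time samples.
# The sample data below is fabricated for illustration.

def percentile_95(samples):
    """95th percentile by the nearest-rank method."""
    ordered = sorted(samples)
    rank = max(1, -(-95 * len(ordered) // 100))  # ceil(0.95 * n)
    return ordered[rank - 1]

def meets_target(samples, limit_seconds):
    return percentile_95(samples) <= limit_seconds

logon_times = [1.2, 2.8, 3.1, 1.9, 2.2, 4.6, 2.5, 3.0, 1.4, 2.7]
print(meets_target(logon_times, 5.0))  # True: the 95th percentile is 4.6 s
```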

3.2 Performance Testing Scenarios

Scenario 1: PropTestCombo25

Group Name          Script Name      Vuser Quantity
PropTestCombo1      PropTestCombo1   10
PropTestCombo1_1    PropTestCombo1   10
PropTestCombo1_2    PropTestCombo1   10
PropTestCombo1_3    PropTestCombo1   10
PropTestCombo1_4    PropTestCombo1   10
PropTestCombo1_5    PropTestCombo1   10
PropTestCombo1_6    PropTestCombo1   10
PropTestCombo2      PropTestCombo2   10
PropTestCombo2_1    PropTestCombo2   10
PropTestCombo2_2    PropTestCombo2   10
PropTestCombo3      PropTestCombo3   10
PropTestCombo4      PropTestCombo4   10
PropTestCombo4_1    PropTestCombo4   10
PropTestCombo4_2    PropTestCombo4   10
PropTestCombo4_3    PropTestCombo4   10
PropTestCombo4_4    PropTestCombo4   10
PropTestCombo4_5    PropTestCombo4   10
PropTestCombo5      PropTestCombo5   10
PropTestCombo6      PropTestCombo6   10
PropTestCombo6_1    PropTestCombo6   10
PropTestCombo6_2    PropTestCombo6   10
PropTestCombo6_3    PropTestCombo6   10
PropTestCombo6_4    PropTestCombo6   10
PropTestCombo6_5    PropTestCombo6   10
PropTestCombo7      PropTestCombo7   10
PropTestCombo8      PropTestCombo8   10

Total number of vusers = 250

Scenario 2: PropTestCombo50

Group Name          Script Name      Vuser Quantity
PropTestCombo1      PropTestCombo1   10
PropTestCombo1_1    PropTestCombo1   10
PropTestCombo1_2    PropTestCombo1   10
PropTestCombo1_3    PropTestCombo1   10
PropTestCombo1_4    PropTestCombo1   10
PropTestCombo1_5    PropTestCombo1   10
PropTestCombo1_6    PropTestCombo1   10
PropTestCombo2      PropTestCombo2   10
PropTestCombo2_1    PropTestCombo2   10
PropTestCombo2_2    PropTestCombo2   10
PropTestCombo3      PropTestCombo3   10
PropTestCombo3_1    PropTestCombo3   10
PropTestCombo4      PropTestCombo4   10
PropTestCombo4_1    PropTestCombo4   10
PropTestCombo4_2    PropTestCombo4   10
PropTestCombo4_3    PropTestCombo4   10
PropTestCombo4_4    PropTestCombo4   10
PropTestCombo4_5    PropTestCombo4   10
PropTestCombo4_6    PropTestCombo4   50
PropTestCombo4_7    PropTestCombo4   50
PropTestCombo5      PropTestCombo5   10
PropTestCombo6      PropTestCombo6   10
PropTestCombo6_1    PropTestCombo6   10
PropTestCombo6_2    PropTestCombo6   10
PropTestCombo6_3    PropTestCombo6   10
PropTestCombo6_4    PropTestCombo6   10
PropTestCombo6_5    PropTestCombo6   10
PropTestCombo6_6    PropTestCombo6   50
PropTestCombo6_7    PropTestCombo6   50
PropTestCombo7      PropTestCombo7   10
PropTestCombo8      PropTestCombo8   10
PropTestCombo8_1    PropTestCombo8   10
PropTestCombo8_2    PropTestCombo8   10
PropTestCombo8_3    PropTestCombo8   10

Total number of vusers = 500

During the test execution stage of the performance test, each planned load scenario will be executed. The performance of the XXXXX system will be monitored and tracked during execution of each load scenario.

For each test scenario, the following conditions apply:

- All tests will ramp up 1 vuser every 15 seconds.
- Each test group will be selected to start randomly.
- Think times of 20 seconds per page are utilized.
- All tests will utilize LAN speed (100 Mbps).
- All tests will run for one hour unless otherwise specified.
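The ramp-up setting has a direct arithmetic consequence worth keeping in mind when scheduling runs: at 1 vuser every 15 seconds, full load is not reached until vusers × 15 seconds have elapsed, which for the larger scenario exceeds the default one-hour run length and may call for an extended run. A quick sketch:

```python
# Ramp-up arithmetic for the "1 vuser every 15 seconds" setting.

RAMP_INTERVAL_S = 15

def ramp_up_minutes(vusers):
    """Minutes until all vusers have started."""
    return vusers * RAMP_INTERVAL_S / 60

print(ramp_up_minutes(250))  # 62.5 minutes for Scenario 1
print(ramp_up_minutes(500))  # 125.0 minutes for Scenario 2
```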

3.3 Elements of Work in Performance Test Scope

The following high-level activities are considered in scope for the performance test:

- Identification of the performance requirements.
- Creation of the automated virtual user scripts.
- Creation of the automated load scenarios plus the transaction mix.
- Setting up the test environment for the test.
- Setting up the test data required for the test.
- Execution of the performance test model.
- Entering performance defects into an Excel spreadsheet.
- Reporting test execution results/status (e.g., daily/weekly meetings, defect reporting).

3.4 Elements of Work not in Performance Test Scope

The following high-level activities are considered out of scope for the testing team:

- Fixing defects.
- Unit testing the software components of the XXXXX system.

4. TEST APPROACH

The approach to performance testing consists of executing testing in stages, which are incorporated into the overall project development life cycle. Multiple execution cycles may occur in the course of the testing project, based on the assessed quality of the system under test.

The generic high-level stages performed by the test team during the course of testing the XXXXX system are:

- Stage 1, Test Strategy Development: creation of the XXXXX Performance Test Strategy and the high-level testing project work plan.
- Stage 2, Requirements Review: development of the XXXXX Performance Test Plan.
- Stage 3, Test Script Design and Development: preparation of test data, test cycle planning, and development of detailed test scripts.
- Stage 4, Test Script Execution: execution of test scripts, management of test results, and utilization of the defect tracking process.

4.1 Performance Test Approach

4.1.1 Stage 1: Test Strategy Development


This document represents the Test Strategy.
4.1.2 Stage 2: Requirements Review

The performance requirements for the project will be determined by reviewing the High Level Design document and the Detailed Design document, reviewing the Project Requirements document and, when needed, conducting interviews with the subject matter experts for the application.

At the conclusion of this stage, transaction mixes and load scenarios should be defined for the performance test. The automated performance testing tool uses the Perfmon and Netmon utilities of Windows NT to gather execution statistics from the performance test. The performance test group should identify the performance statistics to be gathered as a result of the performance test.
4.1.3 Stage 3: Test Script Design and Development

During this stage of the performance test, the automated load scenarios will be built with the LoadRunner Virtual User Generator and the LoadRunner Controller. Load scenario development includes recording the selected transactions into virtual user scripts, parameterizing input data for the scripts, building any required verification points into the scripts, creating/configuring user groups, and assigning virtual user scripts to user groups.

During this stage, the performance test group should configure the server monitor to capture the performance statistics it requires for the web server, the application server, and the database server.
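Parameterizing input data typically means preparing a delimited data file from which each vuser draws distinct values at run time. The sketch below illustrates generating such a file; the field names and value formats are assumptions for illustration, not taken from the XXXXX design documents.

```python
# Illustrative sketch: generate a parameter data file so that each vuser
# submits distinct input data (e.g., for the Registration script). Field
# names and value formats are made up.

import csv

def write_parameter_file(path, n_users):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["username", "postcode"])
        for i in range(1, n_users + 1):
            writer.writerow([f"perfuser{i:03d}", f"AB{i % 99 + 1} 2CD"])

write_parameter_file("registration_params.csv", 50)  # one row per vuser
```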

4.1.4 Stage 4: Test Script Execution

During the test execution stage, the planned load scenarios are executed via the LoadRunner Controller and Scheduler. Performance problems encountered during test execution will be entered into an Excel spreadsheet as performance defects. After a defect fix is complete, the fix will be migrated into the test environment and regression tested by the test team. If the fix requires a change to the product's application code, it should be functionally tested first to ensure that it does not affect the functionality of the product; after the functional regression test is complete, the load scenario should be re-executed to determine the fix's impact on performance. If the fix requires a change to the technical environment (e.g., web server configuration, application server configuration, database server, etc.), the load scenario should likewise be re-executed to determine the fix's impact on performance.
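The spreadsheet-based defect log described above could be maintained as a CSV file that Excel opens directly. The column set below is an assumption for illustration; this document does not specify the spreadsheet layout.

```python
# Minimal sketch of an Excel-compatible performance defect log as a CSV
# file. The columns are ASSUMED; adapt them to the team's actual template.

import csv
import os

DEFECT_COLUMNS = ["id", "date", "scenario", "transaction",
                  "observed", "target", "status"]

def log_defect(path, row):
    """Append one performance defect; write the header on first use."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=DEFECT_COLUMNS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)

log_defect("perf_defects.csv", {
    "id": 1, "date": "10/03/01", "scenario": "PropTestCombo25",
    "transaction": "DIP request", "observed": "41 s", "target": "30 s",
    "status": "open",
})
```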

5. TEST DELIVERABLES

The performance test deliverables are:

5.1 Project Test Deliverables

- Performance Test Strategy (this document)
- Status Reports
- Performance Test Summary Document

5.2 Performance Test Deliverables

- Virtual User Scripts (LoadRunner)
- Load Scenarios (LoadRunner)
- Test Execution Reports (LoadRunner)

6. TEST ENVIRONMENT REQUIREMENTS

6.1 Configuration

COMPONENT                  HARDWARE                               SOFTWARE

Test Client Workstations   Pentium III 500 MHz                    Windows NT 4.0 Workstation
                           384 MB RAM                             Internet Explorer 5.01 or higher

Web Servers                Compaq DL 380 rack-mounted unit        Windows NT Server 4.0 SP6a
                           Dual Pentium III 1 GHz Xeon CPU        IIS 4.0
                           1 GB RAM                               SQL Client
                           18.2 GB SCSI mirrored drive            SMTP

Application Servers        Compaq DL 380 rack-mounted unit        Windows NT Server 4.0 SP6a
                           Dual Pentium III 1 GHz Xeon CPU        XXXXX
                           1 GB RAM
                           18.2 GB SCSI mirrored drive

Database Servers           Compaq DL 580 rack-mounted unit        Windows NT Server 4.0 SP6a
                           Dual Pentium III 1 GHz Xeon CPU        SQL Server 7.0 SP2
                           1 GB RAM
                           18.0 GB RAID 5 drives (54 GB total)

6.2 Test Data

Test scripts will be written to encompass the most complex portion of each identified business process, utilizing complex data. If one test script does not accomplish the entire test for a business process, multiple test scripts will be created.

7. ASSUMPTIONS, DEPENDENCIES, RISKS

7.1 Assumptions

- The test lab is complete; all web servers, application servers, and database servers have been acquired, installed, and tested.
- The test lab includes client desktops running all of the browser/operating system combinations required to test the XXXXX system.
- The test lab provides access to third-party vendors.
- The test lab's databases have sufficient data to test all major functional business requirements of the XXXXX system.
- All software components of the XXXXX system have been unit and integration tested by the development team.
- All requirements for the project have been implemented into the product.

7.2 Dependencies

- The test environment must be functional.
- The XXXXX application software must function properly and be stable in order to complete load/performance testing.

7.3 Risks

If any of the assumptions or dependencies is not met, the XXXXX performance test project may not be completed on time or may go over budget.

8. PERFORMANCE TEST WORK PLAN

Preparation
  Develop Test Plan ............................... 8/21-9/14/01
  Configure Test Servers .......................... 8/27-9/14/01
  Database Load ................................... 9/10-9/14/01
Software Installation
  Final Code Release .............................. 9/11/01
  Final Installation Instructions Release ......... 9/11/01
  Software Installation ........................... 9/11-9/14/01
  Validate Installation ........................... 9/14/01
Test Execution
  Develop Test Scripts ............................ 9/17-9/19/01
  Cycle 1 Tests Begin ............................. 9/19-9/26/01
    PROP Test 1 Script ............................ 9/19/01
    PROP Test 2 Script ............................ 9/20/01
    PROP Test 3 Script ............................ 9/21/01
    PROP Test 4 Script ............................ 9/24/01
    Retests ....................................... 9/25/01
    Retests ....................................... 9/26/01
  Cycle 2 Code Release ............................ 9/27/01
  Cycle 2 Installation Instructions Release ....... 9/27/01
  Cycle 2 Software Installation ................... 9/27-10/02/01
  Cycle 2 Validate Installation ................... 10/02/01
  Cycle 2 Tests Begin ............................. 10/03-10/10/01
    PROP Test 1 Script ............................ 10/03/01
    PROP Test 2 Script ............................ 10/04/01
    PROP Test 3 Script ............................ 10/05/01
    PROP Test 4 Script ............................ 10/08/01
    Retests ....................................... 10/09/01
    Retests ....................................... 10/10/01
Production Implementation
  Installation Certification ...................... 10/11/01
  Production Installation ......................... 10/12/01
