
Test Plan Template

(Name of the Product)

Prepared by:
(Names of Group Members)

(Date)
TABLE OF CONTENTS
1.0 Introduction
2.0 Objectives and Tasks
2.1 Objectives
2.2 Tasks
3.0 Scope
4.0 Testing Strategy
4.1 Unit Testing
4.2 System And Integration Testing
4.3 Performance And Stress Testing
4.4 User Acceptance Testing
4.5 Automated Regression Testing
4.6 Beta Testing
4.7 Compatibility Testing
5.0 Hardware Requirements
6.0 Environment Requirements
6.1 Main Frame
6.2 Workstation
7.0 Test Schedule
8.0 Control Procedures
9.0 Features To Be Tested
10.0 Features Not To Be Tested
11.0 Resources/Roles & Responsibilities
12.0 Schedules
13.0 Significantly Impacted Departments (SIDs)
14.0 Dependencies
15.0 Risks/Assumptions
16.0 Tools
17.0 Approvals
1.0 INTRODUCTION
A brief summary of the product being tested. Outline all the functions at a high level.

2.0 OBJECTIVES AND TASKS


2.1 Objectives of Test Plan
Describe the objectives supported by the Master Test Plan, e.g., defining tasks and
responsibilities, serving as a vehicle for communication, acting as a document to be
used as a service-level agreement, etc.

2.2 Tasks of Test Plan


List all tasks identified by this Test Plan, i.e., testing, post-testing, problem reporting,
etc.
We treat each implemented functional requirement as a test scenario, and for every
test scenario we prepare a set of test cases. Below is a sample requirements
traceability table for task mapping (#BR > #FR > #TS > #TA).

(Add your project functionalities in this table and prepare test scenarios)
Sr.   #Business Requirements     #Functional Requirements    #Test Scenarios (TS)        #Testing Approaches/
No.   (BR)                       (FR)                                                    Strategies (TA)

01    Reduce call center and     Forums or content           Forum Board (Create Post    1. Functional Testing
      other support system       co-creation (e.g. wikis)    / Edit Post / Display          (Standard Testing)
      costs per incident                                     Post / Rank Post /          2. Non-Functional
                                                             Delete Post)                   Testing

02    Increase customer          Gamification &              Leaderboard (Rank Customer
      loyalty and engagement     Leaderboard                 Creation / Discount
                                                             Offerings)
3.0 SCOPE
General
This section describes what is being tested, such as all the functions of a specific
product, its existing interfaces, and the integration of all functions. List here how you
will accomplish the items that you have listed in the "Scope" section. For example, if
you have mentioned that you will be testing the existing interfaces, what procedures
would you follow to notify the key people representing their respective areas, and to
allot time in their schedules so they can assist you in accomplishing this activity?

4.0 TESTING STRATEGY


Describe the overall approach to testing. For each major group of features or feature
combinations, specify the approach which will ensure that these feature groups are
adequately tested. Specify the major activities, techniques, and tools which are used to
test the designated groups of features.
The approach should be described in sufficient detail to permit identification of the
major testing tasks and estimation of the time required to do each one.

Refer to the following list and choose test strategies for Web-based applications or
for Desktop/Standalone applications.

1. Functional testing (Generally required for all products. The purpose of functional
testing is to reveal defects related to the product’s/component’s functionality and
conformance to documented functional requirement specifications.)
2. Unit testing - usually accomplished by developers
3. Structural testing
4. Exploratory testing (Always write down what you do and what happens when you
run exploratory tests.)
5. Component/sub-component testing
6. Walkthroughs, inspections, desk-checking
7. Verification (e.g., reviews, examinations, walkthroughs, desk-checking, or
inspection of interim work products such as requirements, design, specifications,
documentation, prototypes, code; early detection of errors is highly cost-effective.)
8. Developmental integration testing
9. Developmental system testing
10. User acceptance testing (Generally required for all products. The purpose of
acceptance testing is to convince the user that the product fulfills expected user
needs.)
11. Performance/load/stress testing
12. Security/access testing
13. Usability testing
14. Operational procedure documentation verification
15. Regression testing (reliability)
16. Alpha & Beta Testing
17. Smoke Test - …establish that the system is stable and all major functionality is
present and works under ‘normal’ conditions
18. Pilot testing
19. Recovery testing - …can involve the manual functions of an application, loss of
input capability, loss of communication lines, hardware or operating system failure,
loss of database integrity, operator error, or application system failure
20. Operations testing / Usability Testing (ease of operations)
21. Compliance testing - …verifies that the application was developed in accordance
with information technology standards, procedures, and guidelines
22. Manual-Support Testing - Systems commence when transactions originate and
conclude with the use of the results of processing. The manual part of the system
requires the same attention to testing as does the automated segment.
23. Intersystem [interface] Testing
24. Parallel Testing (e.g., matching test results between the current live system and
the new system)
25. Compliance Testing (Authorization) - Testing should verify that the authorization
rules have been properly implemented and evaluate compliance with them. Test
conditions should include unauthorized transactions or processes to ensure that
they are rejected, as well as ensuring that authorized transactions are accepted.

Then prepare a test strategy table as below (which will include the shortlisted testing
approaches; you may add your own testing approaches as per your project's needs).

                                              Manual Testing          Automated
Type of            Approach                   Using      Using        Testing on    Tools/APIs/Libraries
Testing                                       Device     Emulator     Device

Standard Testing   Unit Testing               No         Yes          No            1. JUnit (unit testing
(Functional        Integration Testing        No         Yes          No               framework)
Testing)           System Testing             Yes        No           Yes           2. Selenium WebDriver/IDE
                   Regression Testing         Yes        No           Yes              (web application
                   Acceptance Testing         Yes        No           No               automation testing
                                                                                       framework)
Special Type of    Compatibility Testing      Yes        No           Yes           3. JMeter (performance/
Testing to         GUI Testing                Yes        No           No               load testing)
Address Specific   Performance Testing                                              4. Appium (mobile app
Challenge          Security Testing                                                    automation framework)
                                                                                    5. Responsive Design Mode
Client-specific    -                          -          -            -                (responsive web design
Testing                                                                                emulator)
                                                                                    6. IronWASP
                                                                                    7. HP LoadRunner
                                                                                    ... and many more

Describe every testing strategy and prepare the corresponding tables for it.

4.1 Unit Testing


Definition:
Specify the minimum degree of comprehensiveness desired. Identify the techniques
which will be used to judge the comprehensiveness of the testing effort (for example,
determining which statements have been executed at least once). Specify any
additional completion criteria (for example, error frequency). The techniques to be
used to trace requirements should be specified.
Participants:
List the names of individuals/departments who would be responsible for Unit Testing.
Methodology:
Describe how unit testing will be conducted. Who will write the test scripts for the
unit testing, what would be the sequence of events of Unit Testing and how will the
testing activity take place?

MODULE/FUNCTIONALITY NAME:
UNIT/CLASS:
CREATED BY:
DATE OF CREATION:
DATE OF REVIEW:

TEST         TEST    TEST    PRE-         TEST    TEST    EXPECTED    POST         ACTUAL    STATUS
UNIT/CLASS   CASE    CASE    CONDITION    STEPS   DATA    RESULT      CONDITION    RESULT    (PASS/FAIL)
             ID
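For illustration, below is a minimal JUnit 4 unit-test sketch for the Forum Board
scenario from section 2.2. The ForumPost class and its methods are hypothetical
placeholders; substitute the actual units of your project.

import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.*;

// Hypothetical unit under test, based on the "Create Post / Edit Post"
// scenario from section 2.2. Replace with your project's real class.
class ForumPost {
    private final String title;
    private String body;

    ForumPost(String title, String body) {
        this.title = title;
        this.body = body;
    }

    String getTitle() { return title; }
    String getBody()  { return body; }
    void editBody(String newBody) { this.body = newBody; }
}

public class ForumPostTest {
    private ForumPost post;

    @Before
    public void setUp() {
        // Pre-condition: a post exists before each test case runs.
        post = new ForumPost("First post", "Hello, forum!");
    }

    @Test
    public void createPostStoresTitleAndBody() {
        // Expected result: the constructor preserves its inputs.
        assertEquals("First post", post.getTitle());
        assertEquals("Hello, forum!", post.getBody());
    }

    @Test
    public void editPostUpdatesBody() {
        // Test step: edit the post, then compare actual vs. expected result.
        post.editBody("Updated text");
        assertEquals("Updated text", post.getBody());
    }
}

Each @Test method corresponds to one row of the table above: the @Before method
establishes the pre-condition, the method body performs the test steps, and the
assertions compare actual against expected results.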
4.2 System and Integration Testing
Definition:
State your understanding of System and Integration Testing for your project.
Participants:
Who will be conducting System and Integration Testing on your project? List the
individuals that will be responsible for this activity.
Methodology:
Describe how System & Integration testing will be conducted. Who will write the test
scripts for this testing, what will be the sequence of events of System & Integration
testing, and how will the testing activity take place?

PROJECT NAME:
MODULE/FUNCTIONALITY:
CREATED BY:
DATE OF CREATION:
DATE OF REVIEW:

TEST        TEST    TEST    PRE-         TEST    TEST    EXPECTED    POST         ACTUAL    STATUS
SCENARIO    CASE    CASE    CONDITION    STEPS   DATA    RESULT      CONDITION    RESULT    (PASS/FAIL)
            ID
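As an illustration of a scripted system-level test, here is a minimal Selenium
WebDriver sketch in Java for the "Create Post" flow. The URL, element IDs, and
expected page text are hypothetical placeholders for your application's actual values.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class CreatePostSystemTest {
    public static void main(String[] args) {
        // Assumes chromedriver is installed and on the PATH.
        WebDriver driver = new ChromeDriver();
        try {
            // Test steps: open the forum and submit a new post.
            // URL and element IDs below are hypothetical placeholders.
            driver.get("http://localhost:8080/forum");
            driver.findElement(By.id("title")).sendKeys("First post");
            driver.findElement(By.id("body")).sendKeys("Hello, forum!");
            driver.findElement(By.id("submit")).click();

            // Expected result: the new post is displayed on the board.
            boolean displayed = driver.getPageSource().contains("First post");
            System.out.println(displayed ? "PASS" : "FAIL");
        } finally {
            // Post-condition: close the browser session.
            driver.quit();
        }
    }
}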

4.3 Performance and Stress Testing


Definition:
State your understanding of Performance and Stress Testing for your project.
Participants:
Who will be conducting Stress Testing on your project? List the individuals that will
be responsible for this activity.
Methodology:
Describe how Performance & Stress testing will be conducted. Who will write the test
scripts for the testing, what will be the sequence of events of Performance & Stress
Testing, and how will the testing activity take place?

PROJECT NAME:
MODULE/FUNCTIONALITY:
CREATED BY:
DATE OF CREATION:
DATE OF REVIEW:

Web       # Samples   Average   Min   Max   Std. Dev.   Throughput   Avg. Bytes   SENT     RECVD
Page ID                                                                           KB/sec   KB/sec
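The figures in this table typically come from a load-testing tool such as JMeter.
Purely to illustrate what is being measured, here is a minimal hand-rolled
load-generation sketch in Java; the target URL and the thread and request counts are
arbitrary assumptions.

import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class MiniLoadTest {
    public static void main(String[] args) throws Exception {
        final String target = "http://localhost:8080/forum"; // hypothetical page under test
        final int threads = 10, requestsPerThread = 20;

        // Each task issues one GET request and returns its response time in ms.
        Callable<Long> sample = () -> {
            long start = System.nanoTime();
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(target).openConnection();
            conn.getResponseCode();   // waits for the response
            conn.disconnect();
            return (System.nanoTime() - start) / 1_000_000;
        };

        ExecutorService pool = Executors.newFixedThreadPool(threads);
        List<Future<Long>> futures = new ArrayList<>();
        for (int i = 0; i < threads * requestsPerThread; i++) {
            futures.add(pool.submit(sample));
        }

        // Collect the metrics reported in the table above: samples, avg, min, max.
        long min = Long.MAX_VALUE, max = 0, sum = 0;
        for (Future<Long> f : futures) {
            long ms = f.get();
            min = Math.min(min, ms);
            max = Math.max(max, ms);
            sum += ms;
        }
        pool.shutdown();
        System.out.printf("samples=%d avg=%dms min=%dms max=%dms%n",
                futures.size(), sum / futures.size(), min, max);
    }
}

A real tool adds what this sketch omits: ramp-up control, standard deviation and
throughput calculations, per-page reporting, and result logging.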

4.4 User Acceptance Testing


Definition:
The purpose of acceptance test is to confirm that the system is ready for operational
use. During acceptance test, end-users (customers) of the system compare the system
to its initial requirements.
Participants:
Who will be responsible for User Acceptance Testing? List the individuals' names and
responsibility.
Methodology:
Describe how User Acceptance testing will be conducted. Who will write the test
scripts for the testing, what will be the sequence of events of User Acceptance
Testing, and how will the testing activity take place?

PROJECT NAME:
MODULE/FUNCTIONALITY:
CREATED BY:
DATE OF CREATION:
DATE OF REVIEW:

ID   Test Description   #Business Req.   #Functional Req.   Step #   Test Steps   Exp. Results
                        Covered          Covered

4.5 Automated Regression Testing


Definition:
Regression testing is the selective retesting of a system or component to verify that
modifications have not caused unintended effects and that the system or component
still works as specified in the requirements.
Participants:
Methodology:

Prepare a table as per your need.
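One common way to automate regression runs is to bundle the existing automated
tests into a suite that is executed after every modification. Below is a minimal
JUnit 4 sketch; the member test classes are trivial placeholders for your project's
real test classes.

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
import static org.junit.Assert.*;

// Bundles existing automated tests so they can be re-run after every
// modification; unintended side effects then surface immediately.
@RunWith(Suite.class)
@Suite.SuiteClasses({
        RegressionSuite.ForumSmokeTest.class,
        RegressionSuite.LeaderboardSmokeTest.class
})
public class RegressionSuite {

    // Trivial placeholder tests; in a real project the suite would list
    // the test classes from sections 4.1 and 4.2 instead.
    public static class ForumSmokeTest {
        @Test
        public void postCreationStillWorks() {
            assertTrue(true); // replace with real checks
        }
    }

    public static class LeaderboardSmokeTest {
        @Test
        public void rankingStillWorks() {
            assertTrue(true); // replace with real checks
        }
    }
}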

4.6 Beta Testing


Definition: A beta test is the second phase of software testing, in which a sampling
of the intended audience tries the product out.
Participants:
Methodology:
Prepare a table as per your need.

4.7 Compatibility Testing


Definition: Compatibility testing is a type of software testing that checks whether
your software is capable of running on different hardware, operating systems,
applications, network environments, or mobile devices.
Participants: ….
Methodology: ….

      Hardware
      Specification      Operating   Telecom    Interactive Browsing   Testing       Comments
ID    (Processor/Clock   System      Network    Application            (PASS/FAIL)
      Speed/RAM)
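For the browser dimension of compatibility, the same Selenium check can be pointed
at different drivers. A minimal Java sketch follows; the URL is a hypothetical
placeholder, and it assumes chromedriver and geckodriver are installed and on the
PATH.

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class CompatibilityCheck {
    public static void main(String[] args) {
        // Run the same check against each browser listed in the table above.
        WebDriver[] drivers = { new ChromeDriver(), new FirefoxDriver() };
        for (WebDriver driver : drivers) {
            try {
                driver.get("http://localhost:8080/forum"); // hypothetical URL
                // PASS if the page loads with a non-empty title.
                boolean ok = driver.getTitle() != null && !driver.getTitle().isEmpty();
                System.out.println(driver.getClass().getSimpleName()
                        + ": " + (ok ? "PASS" : "FAIL"));
            } finally {
                driver.quit();
            }
        }
    }
}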

5.0 HARDWARE REQUIREMENTS


Computers

Modems

6.0 ENVIRONMENT REQUIREMENTS


6.1 Main Frame
Specify both the necessary and desired properties of the test environment. The
specification should contain the physical characteristics of the facilities, including the
hardware, the communications and system software, the mode of usage (for example,
stand-alone), and any other software or supplies needed to support the test. Also
specify the level of security which must be provided for the test facility, system
software, and proprietary components such as software, data, and hardware. Identify
special test tools needed. Identify any other testing needs (for example, publications or
office space). Identify the source of all needs which are not currently
available to your group.

6.2 Workstation

7.0 TEST SCHEDULE


Include test milestones identified in the Software Project Schedule as well as all item
transmittal events. Define any additional test milestones needed. Estimate the time
required to do each testing task. Specify the schedule for each testing task and test
milestone. For each testing resource (that is, facilities, tools, and staff), specify its
periods of use.
Task Name                                       Start   Finish   Effort       Comments
                                                Date    Date     Estimation
Test Planning
Review Requirements documents                                    2d
Create initial test estimates                                    1d
Staff and train new test resources
First deploy to QA test environment
Functional testing – Iteration 1
Iteration 2 deploy to QA test environment
Functional testing – Iteration 2
System testing
Regression testing
User Acceptance Testing
Resolution of final defects and final
build testing
Deploy to Staging environment
Performance testing
Release to Production

8.0 CONTROL PROCEDURES


Problem Reporting
Document the procedures to follow when an incident is encountered during the testing
process.
If a standard form is going to be used, attach a blank copy as an "Appendix" to
the Test Plan. In the event you are using an automated incident logging system, write
those procedures in this section.

Change Requests
Document the process for making modifications to the software. Identify who will
sign off on the changes and what the criteria would be for including the changes in
the current product. If the changes will affect existing programs, these modules need
to be identified.

9.0 FEATURES TO BE TESTED


Identify all software features and combinations of software features that will be tested.
10.0 FEATURES NOT TO BE TESTED
Identify all features and significant combinations of features which will not be tested
and the reasons.

11.0 RESOURCES/ROLES & RESPONSIBILITIES


Specify the staff members who are involved in the test project and what their roles are
going to be (for example, Mary Brown (User) compiles Test Cases for Acceptance
Testing). Identify groups responsible for managing, designing, preparing, executing,
and resolving the test activities, as well as related issues. Also identify groups
responsible for providing the test environment. These groups may include developers,
testers, operations staff, testing services, etc.

12.0 SCHEDULES
Major Deliverables {Mandatory}
Identify the deliverable documents. You can list the following documents:

Deliverable           For                                        Date / Milestone

Test Plan             Project Manager; QA Director; Test Team
Traceability Matrix   Project Manager; QA Director
Test Status Report    QA Manager; QA Director
Test Result Report    Project Manager
13.0 SIGNIFICANTLY IMPACTED DEPARTMENTS (SIDs)

Department/Business Area Bus. Manager Tester(s)

14.0 DEPENDENCIES
Identify significant constraints on testing, such as test-item availability, testing-
resource availability, and deadlines.

15.0 RISKS/ASSUMPTIONS
Identify the high-risk assumptions of the test plan. Specify contingency plans for each
(for example, delay in delivery of test items might require increased night shift
scheduling to meet the delivery date).

16.0 TOOLS
List the automation tools you are going to use.

Also list the bug tracking tool here.

17.0 APPROVALS

Specify the names and titles of all persons who must approve this plan. Provide space
for the signatures and dates.

Name (In Capital Letters) Signature Date


1.
2.
3.
4.
Defect Report Template
In most companies, a defect reporting tool is used and the elements of a report can vary.
However, in general, a defect report can consist of the following elements.

ID                    Unique identifier given to the defect. (Usually automated.)
Project               Project name.
Product               Product name.
Release Version       Release version of the product. (e.g. 1.2.3)
Module                Specific module of the product where the defect was detected.
Detected Build        Build version of the product where the defect was detected.
Version               (e.g. 1.2.3.5)
Summary               Summary of the defect. Keep this clear and concise.
Description           Detailed description of the defect. Describe as much as
                      possible without repeating anything or using complex words.
                      Keep it simple but comprehensive.
Steps to Replicate    Step-by-step description of the way to reproduce the defect.
                      Number the steps.
Actual Result         The actual result you received when you followed the steps.
Expected Results      The expected results.
Attachments           Attach any additional information, such as screenshots and logs.
Remarks               Any additional comments on the defect.
Defect Severity       Severity of the defect. (See Defect Severity.)
Defect Priority       Priority of the defect. (See Defect Priority.)
Reported By           The name of the person who reported the defect.
Assigned To           The name of the person assigned to analyze/fix the defect.
Status                The status of the defect. (See Defect Life Cycle.)
Fixed Build Version   Build version of the product where the defect was fixed.
                      (e.g. 1.2.3.9)
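If your team records defects programmatically rather than in a tracking tool, the
elements above map naturally onto a simple data type. The sketch below is one
possible shape in Java; the field names and types are illustrative, not a mandated
schema.

// One possible in-code representation of the defect report elements above;
// field names and types are illustrative, not a mandated schema.
public class DefectReport {
    public String id;                 // unique identifier (usually automated)
    public String project;
    public String product;
    public String releaseVersion;     // e.g. "1.2.3"
    public String module;
    public String detectedBuild;      // e.g. "1.2.3.5"
    public String summary;            // clear and concise
    public String description;
    public String[] stepsToReplicate; // numbered reproduction steps
    public String actualResult;
    public String expectedResults;
    public String[] attachments;      // e.g. screenshot or log file paths
    public String remarks;
    public String severity;           // see Defect Severity
    public String priority;           // see Defect Priority
    public String reportedBy;
    public String assignedTo;
    public String status;             // see Defect Life Cycle
    public String fixedBuildVersion;  // e.g. "1.2.3.9"
}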
