
SYSTEM TEST SCENARIOS

<Company Long Name>


<Subject>
Note: When this template is completed, delete all the yellow notes as follows:

1. With your document open, open the Find and Replace window.
2. Expand the Find and Replace window using the More>> button.
3. Place the cursor in the Find what: field.
4. Choose Format > Style.
5. When the Find Style box opens, scroll through the list, select Note, and click OK.
6. Leave the Replace with: field blank.
7. Select Replace, Replace All, or Find Next to remove the yellow notes.
8. Repeat using the Note Wide style if there are any landscape sections that contain wide yellow notes.

Author: <Author>
Creation Date:
Last Updated:
Document Ref: <Document Reference Number>
Version: DRAFT 1A

Note: Title, Subject, Last Updated Date, Reference Number, and Version are marked by a Word Bookmark so that they can be easily reproduced in the header and footer of documents. When you change any of these values, be careful not to accidentally delete the bookmark. You can make bookmarks visible by selecting Office Button > Word Options > Advanced and checking the Show bookmarks option in the Show document content region.

Approvals:

<Approver 1>

<Approver 2>

Note: To add additional approval lines, press [Tab] from the last cell in the table above.

Note: You can delete any elements of this cover page that you do not need for your document.

<Subject> Document Control ii


File Ref: SYSTEM_TEST_SCENARIOS (v. DRAFT 1A )
System Test Scenarios Doc Ref: <Document Reference Number>

1 Document Control

1.1 Change Record

Date       Author     Version    Change Reference
8-May-14   <Author>   Draft 1a   No Previous Document

1.2 Reviewers

Name Position


Contents

1 Document Control ......................................................................... iii


1.1 Change Record ................................................................................ iii
1.2 Reviewers ....................................................................................... iii

2 Introduction ..................................................................................1
2.1 Scope and Purpose ........................................................................... 1
2.2 How to Review ................................................................................. 1
2.3 Related Documents ........................................................................... 1

3 <Application Name> Test Scenario ................................................2


3.1 System Test Scenarios for <area> ...................................................... 2

4 <Application Name> Defect Report ...............................................5

Note: To update the table of contents, put the cursor anywhere in the table and press [F9]. To change the number of levels displayed, select the menu option Insert > Index and Tables, make sure the Table of Contents tab is active, and change the Number of Levels to a new value.


2 Introduction

2.1 Scope and Purpose


This document details the test scenarios that will be used in the execution of System Test.
The purpose of a test scenario is to identify and communicate steps that should be performed, their
expected results, and under which conditions the test scenario should run. Test scenarios are
necessary to verify successful and acceptable implementation of the requirements.

2.2 How to Review


Please use the following criteria when reviewing the System Test Scenarios:

• Have all System Test Scenarios described in the System Test Plan been detailed, or has any discrepancy been explained?
• Are the System Test Scenarios related to specific use case scenarios or other types of requirements?
• Are the test steps for each scenario described in sufficient detail for a tester to understand them without consulting others?
• Does each scenario step describe an understandable expected outcome?
• Have the pre-conditions for the execution of the scenarios been described?
• Has the required test data been described for each test scenario?

2.3 Related Documents


3 <Application Name> Test Scenario

Note: This section presents a list of the test cases, which represent plausible usage of the system by end users in terms of a series of tasks that the user will want to execute. It should be organized by use case, for each use case in the application system.

Test cases have been developed for each use case defined in the application system, and one or more test cases should be developed for each documented flow of events (use case scenarios) of the use case.

3.1 System Test Scenarios for <area>


Note: The <area> may be a functional area or some other means of grouping
the system test scenarios.
Note: It is preferred that all the System Test Scenarios are documented in a defect logging system. If you use such a system, do not duplicate the scenarios here. You can generate a report from the defect logging system if you need it on paper, for example for a (quality) review.
Note: Test Case ID: Unique Test Case ID

Test Case Name: Test Case Name

Use Case Scenario: Basic flow or basic flow with alternate flows

Revision History: Each test case has to have its revision history, in order to know when and by whom it was created or modified.

Objectives: A short sentence or two about which aspect of the system is being tested. If this gets too long, break the test case up or put more information into the feature descriptions.

Test Setup: Describe in full detail the state of the application, as well as any element of the system environment that is of importance during the test; e.g., a particular HTTP port enabled.

Pre-conditions: Assumptions that must be met before the test case can be run. E.g., "User is not already logged in".

Test Data: List of variables and their possible values used in the test
case. You can list specific values or describe value ranges.
E.g., usernameOrEmail = {testuser, bogususer, testuser@website.com,
empty},
password = {valid, invalid, empty}
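The test data listed above can be thought of as a small combinatorial space. As an illustrative sketch only (the variable names mirror the example values in this template and are not part of any real system), the combinations a tester would cover could be enumerated like this:

```python
# Hypothetical sketch of the Test Data example above expressed as code.
# "empty" from the template is represented as the empty string.
from itertools import product

username_or_email = ["testuser", "bogususer", "testuser@website.com", ""]
password = ["valid", "invalid", ""]

# Each combination of variable values yields one candidate test case.
test_cases = list(product(username_or_email, password))
print(len(test_cases))  # 4 usernames x 3 passwords = 12 combinations
```

In practice a tester would prune this list to the combinations that exercise distinct behavior, rather than running every pairing.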


Test Procedure: Describe in full detail the steps that the user has to follow. Be very specific, referencing the specific buttons or links to click, or input fields to fill.

Expected Results: Describe the test case's expected output.
E.g., Message displayed: "This user already exists. Please enter another user.";
Login screen appears;
Successful cash withdrawal.

Comments: Additional comments.

Actual Results: Pass/Fail - current state of the test case

Completed: Date

Signed Out: Name of Tester

Test Log: The test log is a record of the test results. It contains the date,
the name of the tester, if the test passed or failed and why it failed (if
applicable).
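A single test log record, assuming only the fields listed above (date, tester name, pass/fail, and failure reason), could be sketched as follows; the field and class names are illustrative, not part of any real tool:

```python
# Minimal sketch of one test log record, per the fields described above.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class TestLogEntry:
    run_date: date
    tester: str
    passed: bool
    failure_reason: Optional[str] = None  # only meaningful when passed is False

# Example entry: a failed run with its reason recorded.
entry = TestLogEntry(date(2014, 5, 8), "<Tester Name>", False,
                     "Expected login screen, got error page")
print(entry.passed)  # False
```

A defect logging system would normally capture this record for you; the sketch only shows the minimum information each log entry should carry.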

Test Scenario ID    Test Scenario Name

Requirement(s): <Indicate the requirement(s) tested under this scenario. This could, for example, be a system use case scenario, but could also be some other requirement, such as a supplemental requirement.>

Revision History

Date Author Version Description

Test Case created.

Objectives

Test Setup
Requirements
Pre-conditions

Test Data

Variable Values

Test Procedure

Step # Step Details Expected Result

01
02
03

Comments


Test Scenario ID    Test Scenario Name

Actual Results: <PASS/FAIL/SKIPPED>

Completed

Signed Out

Test Log

Date    Tester Name    Actual Results (Pass/Fail)    Comments


4 <Application Name> Defect Report


The following provides a template to report defects found during the execution of the test cases.

Note: Defects should preferably be logged in a defect logging system. Only use the template on the next page if you do not have such a system in place.

Note: Test Case ID: Unique Test Case ID

Version & Release Number: The version of the application that you are testing.

Report Type: Describes the type of problem found; for example, it could be a software or hardware defect.

Severity:

- Low: Minor problems, such as failures at extreme boundary conditions that are unlikely to occur in normal use, or minor errors in layout/formatting. These problems do not impact use of the product in any substantive way.

- Medium: A problem that:
  a) affects a more isolated piece of functionality;
  b) occurs only at certain boundary conditions;
  c) has a workaround (where "don't do that" might be an acceptable answer to the user); or
  d) is very intermittent.

- High: Reserved for serious problems with no workaround. Frequent or reproducible crashes/core dumps would fall in this category, as would major functionality not working.

- Urgent: Reserved for only the most catastrophic of problems, e.g. data corruption or complete inability to use the application.

Problem Summary: Describes what the problem is.

Environment: Environment in which the bug is found.

Reported by: The name and email of person who writes the report.

Detailed Description: Details of the defect that is found.

How to Reproduce: Detailed description of how to reproduce the defect.

Assigned to Developer: The name and email of the developer assigned to fix the defect.

Status:

- Open: The status of the defect when it is entered.
- Fixed: The status of the defect when it is fixed.
- Closed: The status of the defect when the fix is verified.
- Deferred: The status of the defect when it is postponed.
- Not a Defect: The status when the reported problem turns out not to be a defect.

Priority: Assigned by the project manager, who asks the programmers to fix defects in priority order.

Resolution: Defines the current status of the problem. There are four types of resolution: deferred, not a problem, will not fix, and as designed.

Cost/Time to Fix: Estimation of the cost and time to fix the defect.

Authorization: The name and email of the person who authorizes the correction.

Re-tested by: The name and email of the person who re-tests the test case.
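Taken together, the defect fields above form a simple record with two enumerated fields, Severity and Status. A minimal sketch, assuming illustrative names only (this is not the API of any real defect-tracking system):

```python
# Illustrative defect record mirroring the fields described above.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    LOW = "Low"
    MEDIUM = "Medium"
    HIGH = "High"
    URGENT = "Urgent"

class Status(Enum):
    OPEN = "Open"
    FIXED = "Fixed"
    CLOSED = "Closed"
    DEFERRED = "Deferred"
    NOT_A_DEFECT = "Not a Defect"

@dataclass
class DefectReport:
    test_case_id: str
    problem_summary: str
    severity: Severity
    status: Status = Status.OPEN  # a newly entered defect starts as Open

# Example: a high-severity defect reported against test case "TC-001".
defect = DefectReport("TC-001", "Login fails with valid credentials",
                      Severity.HIGH)
print(defect.status.value)  # Open
```

Modeling Severity and Status as closed enumerations matches the intent of the template: testers pick from the fixed value lists rather than typing free text.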

Defect Form

Test Case ID Version & Release Number


Report Type: Severity: Low/Medium/High/Urgent
Problem Summary:
Environment:

Reported by:

Date:
Detailed Description:

How to Reproduce:

Assigned to Developer: Completion Date:

Status: Priority:

Resolution:

Cost/Time to Fix:

Authorization:

Date:


Defect Form

Re-tested by:

Date:
Page __ of ___

