
User Acceptance Testing with SBR (Scenario Based Requirements)

Mary Larson
Testing Services
January 2009

Testing is an organized activity that:

• Validates a system works as expected, and
• Detects differences between expected and actual results.

Verification

• Testing is really verification, which is part of our everyday activities:
  - Balancing our checkbook.
  - Waiting for a red stoplight to turn green before proceeding.
  - Science uses hypotheses and mathematical proofs to verify situations and outcomes.

Testing Guidelines

• We cannot test everything.
• We need to make conscious decisions about where to focus the depth and intensity of testing.
• Use the requirement priority to prioritize test cases for design and execution.

What is a Test Case?

• A test case checks a particular situation or condition within a system.
• Also known as: test condition, test script, test scenario, test specification, etc.

Anatomy of a Test Case

• Test Name and Unique ID
• Linked to one or more Requirements
• Description (field contains)
  - Description and Purpose
  - Pre-conditions
  - Input Data
  - Post-conditions
• Test Steps
  - Step Name
  - Description (action to be performed)
  - Expected Result

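A minimal sketch of how this anatomy could be represented as a data structure, in Python. The class and field names are illustrative only; they are not Quality Center's schema.

```python
# Illustrative representation of the test case anatomy described above.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class TestStep:
    name: str             # Step Name
    description: str      # Action to be performed
    expected_result: str  # What the system should do in response

@dataclass
class TestCase:
    test_id: str                 # Unique ID, e.g. "TC-CLARITY-CreateProject-001"
    name: str                    # Test Name
    requirements: List[str]      # Linked requirement IDs
    description: str             # Description and Purpose
    pre_conditions: List[str] = field(default_factory=list)
    input_data: Dict[str, str] = field(default_factory=dict)
    post_conditions: List[str] = field(default_factory=list)
    steps: List[TestStep] = field(default_factory=list)
```
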
Test Case Best Practices

A good test case…

• is not redundant. Two test cases should not find the same type of defect.
• is simple, easy to follow, and effective. It can be run by a person with domain knowledge without consulting the person who created it.
• is repeatable. A test case, when repeated, will consistently produce the same results.
• is executable independently and independent of other test cases.

Test Case Design – Best Practices

• Limit test cases to 15 – 20 steps.
• Begin each test case with a common starting point, such as a Login step or an initiate-batch-job step.
• Ensure each test step is a simple activity.
• Design test cases that are independent of other test cases (stand on their own).
• Test cases must be assigned to Test Sets before execution can occur.
  - For details on how to create test sets and add test cases to them, refer to the Quality Center eLearning.
  - Contact TTS.Mercury with specific questions on Quality Center usage.
• Prioritize your test cases.
  - Prioritize based on the project risk.
  - Helps determine what to execute, and in what order, if the testing schedule is compressed and execution time is inadequate.
• Link test cases to requirements. This helps ensure requirements are covered by test cases and allows you to demonstrate traceability.

Identifying Potential Test Cases

• Derive the Primary Test Case (happy path)
  - Highest frequency of execution.
• Each Flow = 1 or more Test Cases
  - Lower frequency of execution.
• Prioritize the list of potential test cases based on frequency of use, risk, and complexity; eliminate redundancy to optimize your testing.

Major Risk Factors

• Customer or user impact
• Business operational risk
• Prior defect history
• New and modified features
• Heavily used features
• Complex features
• Exceptions
• Poorly built or maintained areas of the system

Eliminate Redundancy

• Testing redundancy often occurs due to a lack of planning and communication.
• Plan and collaborate with the teams involved in the testing.
  - Determine which teams are responsible for the different stages and types of testing that will be performed.
  - Planning helps to find gaps, as well as identify redundancy.
  - Document the scope of each test stage in the Master Test Plan.
• When designing test cases, be alert for multiple cases that produce the same behavior.
  - If one situation works correctly, we can assume the others are correct too. (We don't need to test them all.)

Test Case Prioritization

• The top 10% to 15% of test cases uncover 75% to 90% of the significant defects.
• Risk prioritization is a method of choosing the 10% to 15% that are the most critical test cases.

Developing Test Cases with SBR

• Scenarios specify the behavior of a system from a user's perspective.
• They provide a powerful source for generating test cases.
• Scenario-based testing identifies test cases that other techniques have difficulty seeing.
• Scenarios focus attention on users and the actions that they perform.

User Acceptance Testing

• High-Level Requirements (REQs) are often used to create test cases.
• Objective: user acceptance of the software by allowing users to "test drive" and "kick the tires".
• Acceptance criteria should be specified in the Master Test Plan. They will vary based on the level of user involvement in the project, business criticality, number of releases/iterations, project timelines, test stages performed, etc.

Quality Center Test Case (Design Steps) Example

Quality Center is Target's Test Management and Defect Tracking Tool.

• Provides a central repository for test cases, test execution results, and defects.
• Customized fields and reports to support TSD best practices and adherence.
• Integrated with CaliberRM (Target's Requirements Management Tool).

Test Case Translation Techniques

• Requirements to Test Cases
  - Functionally decompose requirements into test cases.
  - Use "Verify" statements: e.g., verify that when a change request is created, a unique ID is assigned to it.
• Functional Specification (FS) Flows to Test Cases
  - FS Flows are detailed steps of user actions that describe "process flows" in a system.
  - The detailed actions in FS Flows allow them to be easily translated into test cases.
  - These test cases are useful in uncovering "real use" defects and integration defects.

Creating Detailed Test Cases

Test cases should be derived from Requirements, Business Rules, and Functional Specifications.

• To ensure traceability, Requirements should be linked to Test Cases.
• Note: some test case details may not be discovered until technical design.

  Requirement A + Functional Specification Flow(s) + Business Rule(s) = Test Case 1, Test Case 2

• Note: Requirement A, Test Case 1, and Test Case 2 are made-up names for this example. Requirement and test case names should be descriptive and unique.

Creating Test Cases using SBR

Using Copy/Paste to Clone a Similar Test Case

Test Case 1 was created in Quality Center (see previous slide).

• Test Case 1 was derived from:
  - 2.1 REQ - Ability to search for existing customer information
  - FL-1.1 Search Customer (Call Rep and Customer Service Manager)
• Note that each Test Step contains the following information:
  - Step Name
  - Description (action)
  - Expected Result

Test Case 2 (next slide) was cloned from Test Case 1.

• Quality Center's copy/paste function was used to clone Test Case 2.
• Note that cells with green font have been updated.
• This test case also applies the 2 RUL business rule to the step information.
• Only Step 3's Description and Step 5's Expected Result needed to be updated.

Creating Test Cases using SBR (Con't)

Item                  Description
Requirement A         2.1 REQ - Ability to search for existing customer information
+ Business Rule       2 RUL - A wildcard search must be available for all search criteria
+ Functional Spec     FL-1.1 Search Customer (Call Rep and Customer Service Manager)

Functional Specification flow (FL-1.1 Search Customer):

Step 1: On the home page the user indicates a call has started by clicking the Start Call button. The system records the Call start time and Call Rep.
Step 2: The user clicks the Start button and the Customer Search Page is displayed.
        a. Last Name – data entry, required
        b. State – dropdown menu of all states, required, defaulted to (Select One)
Step 3: The user enters all of the required information and clicks the Search button.
Step 4: The system refreshes the Customer Search page with data pre-populated in the results table.

= Test Case 2 (covers 2.1 REQ, FL-1.1 and 2 RUL) *focus is on applying 2 RUL

Step 1
  Description: On the home page the user indicates a call has started by clicking the Start Call button.
  Expected Result: The system records the Call start time and Call Rep.
Step 2
  Description: The user clicks the Start button.
  Expected Result: The Customer Search Page is displayed.
Step 3
  Description: The user enters a valid Char + wildcard combo (***need to know what the wildcard value is).
  Expected Result: Data entry is allowed.
Step 4
  Description: The user selects a State from the dropdown menu.
  Expected Result: The State can be selected and the correct State value (the item selected) is displayed.
Step 5
  Description: The user clicks the Search button.
  Expected Result: The Customer Search page is refreshed and the correct data for ALL customers matching the Char + wildcard combo is displayed in the results table.

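A minimal sketch of the wildcard rule (2 RUL) that Test Case 2 exercises, assuming '*' as the wildcard character (the actual wildcard value is still an open question above). The search_last_name helper and the sample names are illustrative, not the system's real interface.

```python
# Illustrative check that ALL customers matching a Char + wildcard combo are
# returned; '*' is an assumed wildcard character.
import fnmatch
from typing import List

customers = ["Larson", "Larsen", "Lawrence", "Smith"]

def search_last_name(pattern: str) -> List[str]:
    """Return all customers whose last name matches the wildcard pattern."""
    return [name for name in customers if fnmatch.fnmatch(name, pattern)]

assert search_last_name("La*") == ["Larson", "Larsen", "Lawrence"]
```
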
Test Case Attributes (in Quality Center)

• Test Case Description
  - A description of what the test case is expected to verify/validate.
  - Example: Verify that a user with project setup permissions can create a new Project by entering these valid, mandatory inputs: …
  - Example: Verify that a user with project setup permissions can assign an existing team to a newly created Project.
  - Example: Verify that a user with baseline permissions can modify tasks and re-baseline a Project that has previously been baselined.
• Test Case Name
  - Include a unique ID that identifies the test case.
  - For example: TC-CLARITY-CreateProject-001
• Test Step Description
  - Includes the actions to be taken by the user to complete the test step.
  - For example: Enter Login and Password on the Login Page, then press <OK>.
  - A group of test steps in sequence makes up a test case.
• Test Step Expected Result
  - The expected outcome from performing the Test Step actions (description).
  - For example: The system should log in the user successfully and display the Application Overview Page.

Test Case Attributes (in QC, Con't)

• Actual Result
  - The actual result of the test step, as compared to the Expected Result. The actual result determines the test step status – Pass or Fail.
• Test Case Pre-Conditions
  - Pre-conditions describe the conditions or state the system should be in before executing the test case.
  - For example: To test the assignment of resources to a Project, the resources should exist in the system.
  - For example: To test Post Timesheet, there should be at least one Timesheet in Approved status for the test user.
• Linked Requirement
  - The linked requirement(s) for which the test case is designed to be executed. Linking requirements to test cases helps to assess requirements coverage and traceability.
• Test Case Post-Condition
  - The state of the system after successful execution of the test case.
  - For example: Project created.
  - For example: Timesheet in Approved status.
  - Not required for every test case.
  - Can assist with planning and ordering test cases for execution.

Additional attributes, such as Test Stage, Test Type, and Priority, are available in Quality Center.

Quality Center User's Reference Site

• http://sharepoint.target.com/sites/qualitycenter/default.aspx

Quality Center Process Flow

• Link on the Quality Center User's Reference Site

Test Case Design Techniques
Black Box – Equivalence Partitioning Example

Exam Marks: the valid partition is 0 ≤ Marks ≤ 100; values such as -1 and 101 fall in the invalid partitions on either side.

• Equivalence Partitioning
  - Identify the different logic conditions for this example.
  - Select test cases from each of the specified conditions.
  - Group possible test cases based on conditions. Executing one test case in the group is equivalent to executing the other test cases in the same group.

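A minimal sketch of this partitioning in Python; the partition() helper and the representative values are illustrative only.

```python
# Equivalence partitioning for the Exam Marks example: each partition is a
# group of inputs the system should treat the same way, so one representative
# value per partition is enough.
def partition(marks: int) -> str:
    """Classify an exam mark into its equivalence partition."""
    if marks < 0:
        return "invalid: below 0"
    if marks > 100:
        return "invalid: above 100"
    return "valid: 0 to 100"

# One representative test input per partition (values are illustrative).
for representative in (-1, 50, 101):
    print(representative, "->", partition(representative))
```
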
Test Case Design Techniques
Black Box Testing – Case Study

Marks > 100 OR < 0        Print Error
Marks >= 90               Print A
Marks >= 80 AND < 90      Print B
Marks >= 70 AND < 80      Print C
Marks >= 60 AND < 70      Print D
Marks < 60                Print F

Note: Each of the above conditions represents a test case.

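A sketch of the grading rules above with one test case per condition; the grade() function is an assumed implementation used only to illustrate the partitions.

```python
# Assumed implementation of the grading rules, for illustration only.
def grade(marks: int) -> str:
    if marks > 100 or marks < 0:
        return "Error"
    if marks >= 90:
        return "A"
    if marks >= 80:
        return "B"
    if marks >= 70:
        return "C"
    if marks >= 60:
        return "D"
    return "F"

# One representative test case per condition / equivalence class.
assert grade(101) == "Error" and grade(-1) == "Error"
assert grade(95) == "A"
assert grade(85) == "B"
assert grade(75) == "C"
assert grade(65) == "D"
assert grade(30) == "F"
```
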
Test Case Design Techniques
Black Box Testing – Case Study

Case Study: DPS Summary

The system generates warnings and notifications while processing the Fiscal Month End Process:

• Within 24 - 36 hours before the Fiscal Month End Process: generate mail.
• After completion of the Fiscal Month End Process: generate email on status.
• Failure of the process: generate an error message and create a log file.
• Standard Fiscal Month End DPS Summarization incomplete: generate a warning message.

Develop test cases using Equivalence Class and Boundary Value Analysis.

Test Case Design Techniques
Black Box – Boundary Value Analysis by Example

Action: Generate email to repopulate the DPS driver table.
Condition: Email will be generated 24 to 36 hours prior to Process start.

The valid window for generating the email is 24 to 36 hours before Process start; values above 36 or below 24 fall outside it.

• Boundary Value Analysis
  - Identify test cases to test the 'edge' conditions of boundaries.
  - Helps to exercise boundary values.
  - Complements the Equivalence Partitioning technique.

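A minimal boundary value analysis sketch for the 24 to 36 hour window; should_generate_email() is an assumed helper, and the boundaries are assumed inclusive, purely to show the edge values a tester would pick.

```python
# Boundary value analysis for the "24 to 36 hours before Process start"
# condition. Assumed helper and inclusive boundaries, for illustration only.
def should_generate_email(hours_before_start: int) -> bool:
    return 24 <= hours_before_start <= 36

# Test values at each edge of the window, plus one value just outside each edge.
for hours, expected in [(23, False), (24, True), (25, True),
                        (35, True), (36, True), (37, False)]:
    assert should_generate_email(hours) == expected, hours
print("All boundary checks passed.")
```
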
Test Case Design Techniques
Test Case – Error Handling

• Tests that ensure the system doesn't crash, lose data, or otherwise break when users do something that wasn't planned or anticipated by developers.
• Tests that an appropriate error/warning message is displayed at the right time and in the correct format. Requirements (and business rules) need to provide sufficient detail for design, coding, and testing activities.
  - For example: attempt to enter data outside the boundaries. The software should not permit this.
  - Make illegal combination choices. The software should prevent the action by disabling fields and/or displaying an appropriate error message.
  - Turn off the computer during a transaction. The software should not lose data.

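A minimal sketch of an error-handling test for the first example (data outside the boundaries), assuming a hypothetical set_exam_marks() function and using pytest; none of these names come from the slides.

```python
# Hypothetical function under test: it should reject out-of-range input with a
# clear error rather than accepting it or crashing.
import pytest

def set_exam_marks(value: int) -> int:
    if not 0 <= value <= 100:
        raise ValueError("Marks must be between 0 and 100")
    return value

def test_out_of_range_marks_are_rejected():
    with pytest.raises(ValueError):
        set_exam_marks(101)
    with pytest.raises(ValueError):
        set_exam_marks(-1)
```
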
Q&A

