
Testing and Testing Methodologies - Basics

Training Objectives

 The objective of this training is to provide an overview of testing principles and practices

07/07/09 Software Testing Practice 2


 Introduction to software testing
 Testing in software lifecycle
 Testing methodologies
 Introduction to testing practices



Software Testing & Quality

Quality Assurance: planned and systematic activities necessary to provide adequate confidence that products and services are defect free.
- William Perry

Quality Control: a work bench check procedure where correctness is determined and action is initiated when non-conformance is detected.
- William Perry

Testing is a ‘Safety Net’
- Dr. Roger Pressman
Why Do Bugs Exist? Common Reasons

 Poor understanding and incomplete requirements
 Unrealistic schedules
 Fast changes in requirements
 Too many assumptions and complacency



Why Testing? Some Failures

Some of the major computer system failures listed below give ample evidence that testing is an important activity of the software quality process.

 In April of 1999 a software bug caused the failure of a $1.2 billion military satellite launch, the costliest unmanned accident in the history of Cape Canaveral launches. The failure was the latest in a string of launch failures, triggering a complete military and industry review of U.S. space launch programs, including software integration and testing processes. Congressional oversight hearings were requested.



Why Testing? Some Failures

 Software bugs caused the bank accounts of 823 customers of a major U.S. bank to be credited with $924,844,208.32 each in May of 1996, according to newspaper reports. The American Bankers Association claimed it was the largest such error in banking history. A bank spokesman said the programming errors were corrected and all funds were recovered.



Hurdles normally encountered are:
 Usually a late activity in the project life cycle
 No “concrete” output, and therefore difficult to measure the value addition
 Lack of historical data
 Recognition of its importance is relatively low
 Politically damaging, as you are challenging the developer
 Delivery commitments
 Too much optimism that the software always works correctly



Introduction To Software Testing
 Testing is the process of assuring that a product meets end-user requirements
 Executing a program with the intent of finding errors
 A good test is one that has a high probability of finding undiscovered errors
 All tests should be traceable to customer requirements and repeatable
 Tests should be planned long before testing begins
 Exhaustive testing is not possible
 To be most effective, testing should be conducted by an independent testing group (ITG) or independent testing team

Testing cannot show the absence of software errors; it can only show their presence
Who Should Test The Software?

 Developer: understands the system, but will test ‘gently’ and is driven by ‘delivery’
 ITG or independent testing team: will attempt to break the system and is driven by ‘quality’
 User: will verify acceptance criteria and is driven by ‘cost’


 Introduction to software testing
 Testing in software lifecycle
 Testing methodologies
 Introduction to testing practices



Software Development Life Cycle (V-model)

Verification (left leg) pairs each specification phase with test planning; Validation (right leg) executes the corresponding tests:
 URS + UAT planning → User Acceptance Testing
 SRS + System test planning → System Testing
 HLD + Integration test planning → Integration Testing
 LLD + Unit test planning → Unit Testing
 Coding (base of the V)

After user acceptance testing: delivery / production deployment, followed by maintenance and enhancement.
Software Testing Life Cycle

Test scope
- Baseline inventory
- Acceptance criteria
- Schedule
- Prioritization
- Test references
- Signoff requirement

Test planning
- Approach
- Process & tools
- Methodology
- Delivery models
- Risk plan
- Project workflow
- Quality objectives
- Configuration plan

Test engineering
- Test design
- Formal specs
- Test scenarios
- Test cases
- Test data
- Tool development

Test execution
- Implement stubs
- Test data feeders
- Batch processes
- Execute testing
- Collate test data
- Identify bugs

Defect analysis
- Check unexpected behavior
- Identify defective application areas
- Identify erroneous test data
- Identify defect trends / patterns

Test closure
- Stop testing
- Prepare reports
- Prepare test closure document


What is Verification & Validation?

Verification
All “REVIEW” activities throughout the life cycle that ensure the product deliverables meet their specifications.

Validation
The “TEST” phase of the life cycle, which ensures that the end product meets the specifications.

Verification - Are we building the product right?
Validation - Are we building the right product?


 Introduction to software testing
 Testing in software lifecycle
 Testing methodologies
 Introduction to testing practices



Categories of Testing

Testing Levels:
 Unit Testing
 Integration
 System
 Acceptance

Techniques:
 Structural (White Box)
 Functional (Black Box)
 Risk Based
 Heuristic

Unit Testing - Structural Testing
Integration - Structural and Functional
System - Functional, Risk based and Heuristic
Acceptance - Functional, Risk based and Heuristic
Structural Testing (White Box)

 A technique where the program structure/specs are used to define test cases
 The program is viewed as a graph: Inputs → Program → Outputs
White Box Testing Methods

 Statement Coverage
 Branch Coverage
 Loop Coverage
 Data flow


Statement Coverage

Definition: this technique is used to ensure that every statement / decision in the program is executed at least once.

Program sample:
//statement 1
//statement 2
If ((A > 1) and (B = 0))
    //sub-statement 1
Else
    //sub-statement 2

Test conditions:
1. (A > 1) and (B = 0)
2. (A <= 1) and (B NOT = 0)
3. (A <= 1) and (B = 0)
4. (A > 1) and (B NOT = 0)

Description: statement coverage requires only that the if…else statement be executed once - not that sub-statements 1 and 2 be executed.

 Minimum level of structural coverage achieved
 Helps to identify unreachable code and its removal if required
 “Null else” problem: it does not ensure exercising the statements completely. Example: if x < 5 then x = x + 3; the x >= 5 decision is not enforced, so some paths are not covered.
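The “null else” problem can be seen in a short executable sketch (Python here for illustration; the function and values are hypothetical): one test input reaches every statement, yet the implicit else path (x >= 5) is never exercised.

```python
def bump(x):
    # "Null else": there is no else branch, so statement coverage
    # can be achieved without ever taking the implicit else path.
    if x < 5:
        x = x + 3
    return x

# This single test executes every statement in bump()...
assert bump(2) == 5
# ...but only this second test exercises the x >= 5 decision outcome,
# which plain statement coverage does not require.
assert bump(7) == 7
```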
Branch Coverage

Definition: a test case design technique in which test cases are designed to execute all the outcomes of every decision.

Program sample:
IF Y > 1 THEN
    Y = Y + 1
    IF Y > 9 THEN
        Y = Y + 1
    ELSE
        Y = Y + 3
    END
    Y = Y + 2
ELSE
    Y = Y + 4
END

No. of paths = 3, so 3 test cases:
1. (Y > 1) and (Y > 9)
2. (Y > 1) and (Y <= 9)
3. (Y <= 1)


Branch Coverage Testing - Strengths and Weaknesses

 Considered superior to statement testing.
 Solves the “null else” problem of statement testing by forcing all the decisions.
 It does not exercise compound decisions well. Example: if ((a > 5) or (b < 10)) - one of the two conditions may never get exercised.
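The three branch-coverage cases above can be checked directly; a minimal sketch (Python, with a hypothetical function name standing in for the slide's sample):

```python
def update(y):
    # Nested decisions from the branch-coverage program sample.
    if y > 1:
        y = y + 1
        if y > 9:      # note: tested against the already-incremented y
            y = y + 1
        else:
            y = y + 3
        y = y + 2
    else:
        y = y + 4
    return y

# Test case 1: (Y > 1) and (Y > 9) -> both "true" branches
assert update(10) == 14   # 10+1=11, 11>9 so 11+1=12, 12+2=14
# Test case 2: (Y > 1) and (Y <= 9) -> inner "false" branch
assert update(3) == 9     # 3+1=4, 4<=9 so 4+3=7, 7+2=9
# Test case 3: (Y <= 1) -> outer "false" branch
assert update(0) == 4
```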


Introduction to Cyclomatic Complexity - Control Graphs

Definition: CC is a quantitative measure to identify the logical complexity of a program using a control graph. A control graph is a graphical representation of the flow of execution.

Program sample:
void abx(int a, int b, int x)
{
1    if (( a > 1 ) && ( b == 0 ))
2        x = x / a;
3    if (( a == 2 ) || ( x > 1 ))
4        x = x + 1;
5    print(x);
}

Description:
 The numbers represent various program segments in the unit
 A node in the control graph (circle) represents a segment
 An edge (arrow) indicates all other segments that can be reached from the given segment
Cyclomatic Complexity

Definition: CC indicates the upper bound of the number of independent paths to be tested in a unit.

Formula: C = E - N + 2, where
 C is the cyclomatic complexity
 E is the number of edges in the graph
 N is the number of nodes in the graph

Program sample:
void abx(int a, int b, int x)
{
1    if (( a > 1 ) && ( b == 0 ))
2        x = x / a;
3    if (( a == 2 ) || ( x > 1 ))
4        x = x + 1;
5    print(x);
}

Calculation: No. of edges = 6, No. of nodes = 5
C = 6 - 5 + 2 = 3, so No. of paths = 3 and No. of test cases = 3
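The formula C = E - N + 2 can be computed mechanically from an edge list; a small sketch (Python; the edge list below is one reading of the abx() control graph, with segments 1-5 as nodes):

```python
def cyclomatic_complexity(edges):
    # C = E - N + 2, with N derived from the distinct nodes in the edge list.
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2

# Control graph of the abx() sample: two if-statements in sequence.
# 1->2 (if true), 1->3 (if false), 2->3, 3->4 (if true), 3->5 (if false), 4->5
abx_edges = [(1, 2), (1, 3), (2, 3), (3, 4), (3, 5), (4, 5)]
assert cyclomatic_complexity(abx_edges) == 3   # matches C = 6 - 5 + 2
```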
Condition Coverage - AND

Definition:
 Both parts of the predicate are tested
 The program sample shows that all 4 test conditions are tested

Conditions table (2^n):
A > 1   AND   B = 0    RESULT
TRUE    AND   TRUE     TRUE
TRUE    AND   FALSE    FALSE
FALSE   AND   FALSE    FALSE
FALSE   AND   TRUE     FALSE

Program sample:
If ((A > 1) AND (B = 0))
{
    //sub-statement 1
}
Else
{
    //sub-statement 2
}

Test conditions:
1. (A > 1) AND (B = 0)
2. (A > 1) AND (B NOT = 0)
3. (A <= 1) AND (B NOT = 0)
4. (A <= 1) AND (B = 0)
Condition Coverage - OR

Definition:
 Both parts of the predicate are tested
 The program sample shows that all 4 test conditions are tested

Conditions table (2^n):
A > 1   OR   B = 0    RESULT
TRUE    OR   TRUE     TRUE
TRUE    OR   FALSE    TRUE
FALSE   OR   FALSE    FALSE
FALSE   OR   TRUE     TRUE

Program sample:
If ((A > 1) OR (B = 0))
{
    //sub-statement 1
}
Else
{
    //sub-statement 2
}

Test conditions:
1. (A > 1) OR (B = 0)
2. (A <= 1) OR (B NOT = 0)
3. (A <= 1) OR (B = 0)
4. (A > 1) OR (B NOT = 0)
Condition Coverage

Decision/Condition Coverage:
 Combination of decision coverage and condition coverage
 The problem of language support for condition coverage persists

Multiple Condition Coverage:
 All combinations of simple conditions (true & false evaluations)
 For n conditions, 2^n combinations
 Impractical if there are many conditions; testers tend to get confused
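The 2^n growth of multiple condition coverage can be enumerated directly; a sketch (Python, using a hypothetical helper name):

```python
from itertools import product

def multiple_condition_cases(n):
    # All 2^n true/false combinations for n simple conditions.
    return list(product([True, False], repeat=n))

# Two conditions, e.g. (A > 1) and (B = 0): 2^2 = 4 combinations.
cases = multiple_condition_cases(2)
assert len(cases) == 4
# Ten conditions already need 1024 combinations -- why the slide calls
# this impractical when there are many conditions.
assert len(multiple_condition_cases(10)) == 1024
```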


Loop Coverage

Loop types:
 Simple
 Nested loops
 Serial / concatenated loops
 Unstructured loops (goto)

Coverage techniques:
 Boundary value tests
 Cyclomatic complexity

Example of CC:
for ( I = 1 ; I < n ; I++ )
    printf("Simple Loop");
E = 5, N = 5
CC = E - N + 2 = 2
Loop Testing

(Diagrams: simple loop, nested loops, concatenated loops, unstructured loops)
Loop Coverage

Concatenated loop:
for (I = 1; I < n; I++)
    statement 1
for (k = 1; k < n; k++)
    statement 1

Nested loop:
for (I = 1; I < n; I++)
    for (k = 1; k < n; k++)
        statement 1

Simple loops:
 A minimum test is 2 iterations, to detect data initialization and use faults
 Must exercise the domain boundary of the loop control variable

Nested loops:
 Test the inner loop first, the outer last
 Set all outer loop controls to minimum values
 Set inner and outer loop controls to typical values

Serial / concatenated loops:
 More loops in the same control path
 If a define/use data relationship exists, treat them as nested loops; if not, treat them as simple loops
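“Exercise the domain boundary of the loop control variable” means testing a loop at zero, one, typical, and maximum iteration counts; a sketch (Python, with a hypothetical summing function):

```python
def sum_first(values, n):
    # Simple loop whose iteration count is controlled by n.
    total = 0
    for i in range(min(n, len(values))):
        total += values[i]
    return total

data = [10, 20, 30, 40]
# Boundary tests on the loop control variable:
assert sum_first(data, 0) == 0     # zero iterations (loop skipped)
assert sum_first(data, 1) == 10    # exactly one iteration
assert sum_first(data, 2) == 30    # two iterations (minimum to catch data faults)
assert sum_first(data, 4) == 100   # maximum iterations
assert sum_first(data, 5) == 100   # just above the maximum
```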
Data Flow Testing

Data states:
 Defined (initialized, but not used yet)
 Used (value evaluated)
 Killed

Test contents:
 A statement where a variable is defined (assigned a value)
 A statement where that variable is used with that definition active
 A statement where that variable is used with that definition killed/freed

Features of data flow testing:
 Each flow path must be tested at least once
 The benefit is to guard against defects where the wrong value is used
 Usually results in more path tests than complete condition coverage


Data Flow Testing Example

Program sample:
begin
    x = 2            -- data state: defined
    loop
        x = x + 1    -- data state: used
        if (x = 5) then
            exit
        else
            continue
        end if
    end loop
    x = 0            -- data state: killed/freed
end
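The sample's define/use/kill states can be traced in executable form; a sketch (Python, with a hypothetical function name):

```python
def count_to_five():
    trace = []
    x = 2                # defined: x is initialized but not yet used
    while True:
        x = x + 1        # used: x's current value is evaluated
        trace.append(x)
        if x == 5:
            break
    x = 0                # killed: the loop's definition of x is overwritten
    return trace, x

trace, final = count_to_five()
assert trace == [3, 4, 5]   # x is used on each iteration until it reaches 5
assert final == 0           # the earlier definition of x has been killed
```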


Functional Testing (Black Box)

 Tests of business requirements based on external specifications, without knowledge of how the system is constructed (Inputs → system → Outputs)
Black Box Techniques

Low-level techniques:
 Equivalence partitioning
 Boundary value analysis
 Input and output domain
Equivalence Partitioning

 Divides the input domain of a program into classes of data
 Derives test cases based on these partitions
 An equivalence class is a set of valid or invalid states of input
 Test case design is based on equivalence classes for an input domain

(Diagram: valid and invalid inputs feed the system, producing output)


Equivalence Partitioning

Invalid        Valid range         Invalid
2              7                   12
Less than 4    Between 4 and 10    More than 10

Input range [4,10], test values [2, 7, 12]

 Useful in reducing the number of test cases required
 Very useful when the input / output domain is amenable to partitioning
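The [4,10] example maps to one test value per partition; a sketch (Python, with a hypothetical validator):

```python
def in_range(value, low=4, high=10):
    # Accepts values in the closed range [low, high].
    return low <= value <= high

# One representative per equivalence class is enough:
assert in_range(2) is False    # class 1: below the range (invalid)
assert in_range(7) is True     # class 2: inside [4, 10] (valid)
assert in_range(12) is False   # class 3: above the range (invalid)
```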


Boundary Value Analysis

Less than 4    Between 4 and 10    More than 10
3              4    7    10        11

Input range [4,10], test values [3, 4, 7, 10, 11]

 A black box testing method
 Complements equivalence partitioning
 BVA leads to a selection of test cases that exercise bounding values
 Design test cases using:
    min values of an input
    max values of an input
    values just above and below the input range
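Boundary values for a closed range can be generated mechanically; a sketch (Python, with a hypothetical helper):

```python
def boundary_values(low, high):
    # Just-below, min, a typical midpoint, max, just-above: the five
    # classic BVA probes for a closed integer range [low, high].
    return [low - 1, low, (low + high) // 2, high, high + 1]

# For the input range [4, 10] this reproduces the slide's test values.
assert boundary_values(4, 10) == [3, 4, 7, 10, 11]
```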
Functional vs Structural

Program behavior - do we require both?

 Functional testing establishes confidence
 Structural testing seeks faults


What to verify during each build ?
 Interface integrity
◆ Internal and external interfaces are tested as each module (or
cluster) is incorporated into the structure
 Functional validity
◆ Tests designed to uncover functional errors are conducted
 Information content
◆ Tests designed to uncover errors associated with local or global
data structures are conducted
 Performance
◆ Tests designed to verify performance bounds established during
software design are conducted



Other types of Testing
Usability testing
Security testing
Smoke Testing
Configuration testing
Compatibility testing
Installation testing
Reliability testing
Documentation testing



 Introduction to software testing
 Testing in software lifecycle
 Testing methodologies
 Introduction to testing practices



Software Testing Phases

Unit Test → Integration Test → System Test → UAT → Product Release

Environment:  Development | Test lab | Test lab | Simulated production
Performed by: Developers  | Testers  | Testers  | Testers/Users


Integration Testing

Definition: integration testing is a search for component faults that cause inter-component failure.

Purpose: integration tests emphasize the interaction between modules and interfaces.

Method: white box and black box testing
Top Down Integration

 Modules are integrated by moving downward through the control hierarchy
 Verifies major control or decision points early in the test

(Diagram: a driver exercises the module under test; stubs stand in for tested modules lower in the hierarchy)
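In top-down integration, lower-level modules that are not yet integrated are replaced by stubs; a sketch (Python; the function names and values are hypothetical):

```python
def tax_stub(amount):
    # Stub standing in for the not-yet-integrated tax module:
    # returns a fixed, predictable value instead of real logic.
    return 0.0

def total_price(amount, tax_fn):
    # Module under test: higher-level control logic exercised early,
    # before the real tax calculation exists.
    return amount + tax_fn(amount)

# Driver code exercising the module under test against the stub.
assert total_price(100.0, tax_stub) == 100.0
```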
Bottom-up Integration

 Construction and testing start with the atomic modules
 Stubs are NOT needed
 Drivers are needed to exercise the modules under test

(Diagram: a driver exercises the module under test, built on already-tested lower-level modules)


Integration testing within the system

An independent system with several modules:

Application System
 Module A | Module B | Module C

Testing of the interaction between modules A, B and C is known as integration testing within the system.


Integration testing outside the system

An independent system interacting with external systems:

External systems A, B, C, D and a sub system

Testing between the independent system, the external systems and the sub system is integration testing outside the system.


System Testing

Objective: to check whether the system as a whole conforms to the agreed specification and to detect discrepancies between the behavior of the constructed system and its specification.

Scope: complete application with all interfaces, performance, usability, installation etc.

Method: black box


User Acceptance Testing

Objective: to test the integrity of the application by the users in the UAT environment. Acts as a confidence building measure for the users before the product is released.

Scope: complete application with all interfaces, user requirements

Method: black box


Testing Workflow

(Swimlanes: Systems Under Test, Business Analysts, Testing Team)

 Product Management defines the scope
 The testing team prepares the test plan and test cases
 Business analysts review the test plans and TCs; the UAT coordinator signs off
 After a successful smoke test, schedule and execution begin
 Failures are recorded in the defects log and routed to the development team
 Successful runs produce test results for release management
Regression Testing

Objective: to test the integrity of critical business functions of an application after it undergoes changes, independent of the underlying software architecture.

Testing method: black box; automation test tools are used.

Flow: Renovation → Unit test → Integration test → Regression test

 Does not replace thorough unit testing or integration testing
 Not a shortcut for a proper testing cycle (unit, integration, system & user test)
 To the extent that it substitutes for integration testing or, even worse, unit testing, it will find those faults, but at a higher expense of rework to correct them
Performance Testing

Objective: performance testing is performed at the system level to confirm that all the performance objectives are met. It is also performed to check the load handling capacity of the system.

Method: load test, stress test, volume test

Phases: Unit test → Integration test → System test → Performance test

Performance testing verifies that:
 Normal or above normal volumes of transactions can be processed within the expected time frame
 The application system is structurally able to process large amounts of data
 System capacity has enough resources to meet expected turnaround times
 People can actually use the system at peak times
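At its simplest, a load test drives a unit of work at a normal volume and checks the elapsed time against the agreed objective; a minimal sketch (Python; the operation, volume, and threshold are hypothetical):

```python
import time

def handle_request(payload):
    # Stand-in for the operation under load.
    return sum(payload)

def load_test(volume, time_limit_s):
    # Drive `volume` requests and check that the total elapsed time
    # stays inside the performance objective.
    start = time.perf_counter()
    for _ in range(volume):
        handle_request([1, 2, 3])
    elapsed = time.perf_counter() - start
    return elapsed <= time_limit_s

# A normal volume should finish comfortably within the threshold.
assert load_test(volume=10_000, time_limit_s=5.0)
```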
Performance Testing

Load:
 Evaluation of system performance under normal conditions
 Maximum number of users or transactions
 Capability of the system architecture to handle load
 System configuration remains constant

Stress:
 Behavior of the system under abnormal conditions
 High number of users
 Low system resources; limitations are exceeded
 Recovery of the system when “pushed over the edge”

Volume:
 Performance level of the software product at optimum load
 Ability to handle a large number of transactions
 Internal program or system limitations are exceeded
 Analyses the performance for the specific type of test data


Software Testing - Some Bad Practices

 No formal testing techniques used
 Poor level of automation
 No traceability between test cases and software units
 Testing is cut short in the name of economics
 Only easily reachable paths are tested
 No early phase testing, such as unit testing and integration testing
 Stubs and drivers are not used


References

 Software Engineering - Roger Pressman
 Software Engineering - Ian Sommerville
 Black-Box Testing - Boris Beizer
 Effective Methods for Software Testing - William Perry
 Testing Computer Software - Cem Kaner
 Managing the Testing Process - Rex Black


Thank You
