
UNIT III

Integration Testing: Definition, As a type of testing: Top-down
integration, Bottom-up integration, Bidirectional integration,
System integration, Choosing integration method, As a phase of
testing, Scenario testing: System scenarios, Use case scenarios,
Defect bash.
Debugging vs. Testing
• Debugging is the process of finding errors in a
program under development that is not yet thought to
be correct [fix errors]
• Testing is the process of attempting to find errors in
a program that is thought to be correct. Testing
attempts to establish that a program satisfies its
specification [prevent errors]
• Testing can show the presence of errors but
cannot guarantee their absence (E. W. Dijkstra)

7/19/2023 2
Debugging vs. Testing
• Exhaustive testing is not possible for real programs
due to the combinatorial explosion of possible test
cases
• The amount of testing performed must be balanced
against the cost of undiscovered errors
• Regression testing is used to compare a modified
version of a program against a previous version
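The regression idea above can be sketched in a few lines. This is a minimal illustration with invented names: `legacy_tax` stands for the previous version of a function and `tax` for the modified one; a regression suite re-runs the same inputs through both and compares the results.

```python
# Hypothetical pair of versions: the names and the 20% rule are
# invented purely for illustration.
def legacy_tax(income):
    return income * 0.2          # previous version (reference behaviour)

def tax(income):
    return income / 5            # modified version (refactored)

def test_regression():
    # Re-run the same inputs against both versions and compare.
    for income in [0, 1, 100, 99999, 12345.67]:
        assert abs(tax(income) - legacy_tax(income)) < 1e-9

test_regression()
print("regression suite passed")
```

In practice the "previous version" is usually a stored set of expected outputs rather than the old code itself, but the comparison step is the same.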

When to Stop Testing?
• Deadlines (release deadlines, testing deadlines,
etc.)
• Test cases completed with certain percentage
passed
• Test budget depleted
• Coverage of code/functionality/requirements
reaches a specified point
• Bug rate falls below a certain level
• Beta or alpha testing period ends
Toy Testing vs. Professional Testing
Toy Testing
– Small quantity of test data
– Test data selected haphazardly
– No systematic test coverage
– Weak analysis of test output
Professional Testing
– Large quantity of test data
– Systematic selection of test cases
– Test coverage evaluated and improved.
– Careful evaluation of test output
– Heavily automated using scripts and testing tools.

Unit Testing
 Validate a unit of code; smallest testable part
• Executes the code in a sandboxed
environment
• Testing mostly for functional requirements
• can also test some non-functional requirements
• Many approaches and schools of thought
History Based, Risk Based, Data Path, DOE
Unit Testing
• Testing at the module level
• Test information flows across the module interface
• Test the module's handling of its local data structures
• Test boundary conditions for input
• Test all control flow paths in the module

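The boundary-condition bullet above can be made concrete with a small sketch. `classify_age` is an invented function used only for illustration; the test cases deliberately sit on and around the partition boundary between "minor" and "adult".

```python
# Invented unit under test: partitions ages into two classes.
def classify_age(age):
    if age < 0:
        raise ValueError("age cannot be negative")
    if age < 18:
        return "minor"
    return "adult"

# Each case targets a boundary or a representative of a partition.
CASES = [
    (0, "minor"),     # lower boundary of valid input
    (17, "minor"),    # just below the partition boundary
    (18, "adult"),    # exactly on the boundary
    (120, "adult"),   # typical upper value
]

for age, expected in CASES:
    assert classify_age(age) == expected
print("boundary cases passed")
```

A bad-input case (a negative age) would additionally check that the error path raises the expected exception.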
Unit Testing
• Execute all statements at least once
• Test error-handling and response to bad input
• Develop test cases for all of the above
• Write a driver program to read test cases and call the
module
• Write stubs to simulate modules called by the
module under test

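A minimal driver/stub sketch, using the standard library's `unittest.mock`. All names (`total`, `fetch_price`, the pricing module) are invented for illustration: the module under test calls a lower-level pricing module, which the driver replaces with a stub.

```python
from unittest.mock import Mock

# Module under test: sums the price of each item in a cart,
# delegating price lookup to a separate pricing module.
def total(cart, pricing):
    return sum(pricing.fetch_price(item) for item in cart)

# Stub: stands in for the real pricing module with canned answers.
pricing_stub = Mock()
pricing_stub.fetch_price.side_effect = lambda item: {"apple": 2, "pear": 3}[item]

# Driver: builds the test case, wires in the stub, checks the result.
assert total(["apple", "pear", "apple"], pricing_stub) == 7
print("driver/stub test passed")
```

Passing the collaborator in as a parameter (rather than importing it directly) is what makes swapping the stub in trivial; with a hard-wired import, `unittest.mock.patch` would do the same job.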
Unit Testing

Objective: find differences between specified units and their
implementations.

Unit: a component (module, function, class, object, ...)

[Diagram: unit test environment — a driver feeds test cases to the
unit under test; stubs (dummy modules) stand in for the modules it
calls; test results are evaluated for effectiveness via partitioning
and code coverage.]
Unit testing
Objectives:  To test the function of a program or unit of code
               such as a program or module
              To test internal logic
              To verify internal design
              To test path & condition coverage
              To test exception conditions & error handling
When:        After modules are coded
Input:       Internal Application Design
              Master Test Plan
              Unit Test Plan
Output:      Unit Test Report
Who:         Developer
Methods:     White Box testing techniques
              Test Coverage techniques
Tools:       Debug
              Re-structure
              Code Analyzers
              Path/statement coverage tools
Education:   Testing Methodology
              Effective use of tools
Integration Testing
• Objectives:
– To expose problems arising from the combination of components
– To quickly obtain a working solution from the components
• Problem areas
– Internal: between components
• Invocation: call / message passing / ...
• Parameters: type, number, order, value
• Invocation return: identity (who?), type, sequence
– External:
• I/O timing
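The "parameters: type, number, order, value" problem area can be illustrated with a tiny sketch (all names invented): each function may pass its own unit tests, yet a caller and callee can still disagree about parameter order, and only a test that drives the real call across the interface exercises that agreement.

```python
# Callee: expects (amount, account) in that order.
def transfer(amount, account):
    # Guard that would trip if a caller swapped the arguments.
    assert isinstance(amount, (int, float)), "wrong parameter order"
    return {"account": account, "amount": amount}

# Caller: takes its parameters in the opposite order, so the call
# site must translate. An integration test drives this real path.
def pay_salary(account, amount):
    return transfer(amount, account)

result = pay_salary("ACC-1", 500)
assert result == {"account": "ACC-1", "amount": 500}
print("interface check passed")
```

Had `pay_salary` called `transfer(account, amount)` instead, every unit test of `transfer` alone would still pass; only the integration test would expose the mismatch.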
Integration Testing
• Incremental vs. non-incremental system
integration
• Incremental is almost always easier, more
effective and less chaotic

Incremental integration testing

[Diagram: three test sequences — modules A, B, C, D are integrated
one at a time; each sequence re-runs the earlier tests (T1-T5) and
adds tests for the newly added module.]
Integration Testing
(Behavioral: Path-Based)

[Diagram: modules A, B, C connected by messages]

MM-path: an interleaved sequence of module execution paths and
messages

Module execution path: an entry-exit path within the same module
Approaches to Incremental
Integration
Top Down - Start with the main program module,
gradually add subordinate modules.
• Could use a breadth-first or depth-first strategy.
• Test the system as each new module is added.
• Regression test previous modules as required
Bottom Up - Start with atomic (leaf) modules.
• Use drivers to test clusters of modules.
• Gradually add calling modules and move testing
upward
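The top-down steps above can be sketched as follows (all names invented): the top-level module is tested first against a stub for its subordinate, and the stub is later replaced by the real module with the same tests re-run.

```python
# Top-level module under test: delegates formatting to a subordinate.
def report(data, formatter):
    return "REPORT: " + formatter(data)

# Step 1: the subordinate is not integrated yet, so a stub stands in.
def formatter_stub(data):
    return "<stub>"

assert report([1, 2], formatter_stub) == "REPORT: <stub>"

# Step 2: the real subordinate replaces the stub; the same top-level
# test is re-run (regression) plus new tests for the real behaviour.
def real_formatter(data):
    return ",".join(str(x) for x in data)

assert report([1, 2], real_formatter) == "REPORT: 1,2"
print("top-down integration steps passed")
```

Bottom-up integration is the mirror image: `real_formatter` would be tested first via a driver, and `report` added on top later.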
Top-Down Integration
Advantages
• Verifies major control and decision
functions early in testing
• This is a big win if there are major control
problems.
• Depth-first strategy makes parts of
system available early for demo.

Top-Down Integration
Disadvantages
• Building stubs adequate to test upper
levels may require a lot of effort
• May not be able to do complete top-down
integration
• A lot of effort (later discarded) is put into
designing and implementing stubs
Bottom-Up Integration
Advantages
• Stubs aren't required
Disadvantages
• Entire system isn't available until the end of
integration
• Won't detect major control/data/interface
problems until relatively late in the testing
process
Practical Integration Testing
• Identify critical modules and test them early
• Critical modules:
– Address multiple software requirements
– Have a high level of control
– Are particularly complex or error-prone
– Have definite performance requirements
– Have high coupling with other modules
– Appear to be potential performance bottlenecks
– Are on the critical path of the project
Practical Integration Testing
Use sandwich testing as a compromise
between top-down and bottom-up
– Use a top-down strategy for the upper levels of the
program
– Use a bottom-up strategy for the lower levels of the
program
– Carefully meet in the middle
Types of Integration Testing
» Big Bang testing
» Top Down Integration testing
» Bottom Up Integration testing
Integration testing
Objectives:  To technically verify proper interfacing between
               modules, and within sub-systems
When:        After modules are unit tested
Input:       Internal & External Application Design
              Master Test Plan
              Integration Test Plan
Output:      Integration Test Report
Who:         Developers
Methods:     White and Black Box techniques
              Problem / Configuration Management
Tools:       Debug
              Re-structure
              Code Analyzers
Education:   Testing Methodology
              Effective use of tools
System Testing
• Higher-level tests to determine the suitability
and performance of the entire system
(i.e. hardware, software and procedures)
• The execution of a given test case against
program P will address (cover) certain
requirements of P. A measure of testedness
for P is the degree of requirements
coverage produced by the collective set of
test cases for P.
Types of System Testing
• Volume Testing : to determine whether the
program can handle the required volumes
of data, requests, etc.
• Load/Stress Testing : to identify peak load
conditions at which the program will fail to
handle required processing loads within the
required time span
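A load test can be sketched in its simplest form: drive a batch of requests and check the total time against a budget. Everything here is an assumption for illustration; `handle_request` stands in for real request processing, and the 1000-request / 5-second budget is arbitrary.

```python
import time

# Stand-in for real request processing (invented workload).
def handle_request(n):
    return sum(range(n))

start = time.perf_counter()
for _ in range(1000):            # simulated load: 1000 requests
    handle_request(10_000)
elapsed = time.perf_counter() - start

# Fail the test if the load was not handled within the time budget.
assert elapsed < 5.0, f"load of 1000 requests took {elapsed:.2f}s"
print(f"processed 1000 requests in {elapsed:.3f}s")
```

Real load/stress tools additionally ramp the load up until the system fails, issue requests concurrently, and measure per-request latency percentiles rather than only the total.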
Types of System Testing
• Stress testing: push the system to its limit and
beyond

[Diagram: stress inputs — volume, number of users, and application
response rate are pushed against the system's physical and logical
resources.]
Types of System Testing
• Security Testing : to show whether the program's
security mechanisms can be subverted
• Test for the following:
 Authentication Methods
 Limited Access on Need-to-know basis
 Controls over physical access
 Activity Logs
 UID & Time Stamping
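The authentication item in the checklist above can be illustrated with a small negative-test sketch. The `authenticate` function and its user store are invented for illustration; the point is that security tests assert what must be *rejected*, not only what must be accepted.

```python
import hashlib

# Invented user store: passwords are kept only as SHA-256 digests.
_USERS = {"alice": hashlib.sha256(b"s3cret").hexdigest()}

def authenticate(user, password):
    digest = hashlib.sha256(password.encode()).hexdigest()
    return _USERS.get(user) == digest

assert authenticate("alice", "s3cret")        # valid credentials accepted
assert not authenticate("alice", "wrong")     # wrong password rejected
assert not authenticate("mallory", "s3cret")  # unknown user rejected
print("authentication checks passed")
```

Production systems would use a salted, slow password hash (e.g. via a dedicated library) rather than bare SHA-256; the test structure, however, stays the same.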
Types of System Testing
• Usability (human factors) Testing : to
identify those operations that will be
difficult or inconvenient for users.
Publications, facilities, and manual
procedures are also tested
Types of System Testing
• Performance Testing : to determine
whether the program meets its
performance requirements.
• Resource usage Testing : to determine
whether the program uses resources at
levels which exceed requirements.

Types of System Testing
• Configuration Testing : to determine
whether the program operates properly
when the software or hardware is
configured in a required manner
Server H/W configuration
Client H/W configuration
S/W Configuration : Server
S/W Configuration : Clients
Network Configuration
Types of System Testing
• Compatibility/Conversion Testing : to
determine whether the compatibility
objectives of the program have been met
& whether the conversion procedure works
• For switch-over from one system to
another
• Data conversion should be an automated
process

Types of System Testing
• Installation Testing : to identify the ways in
which the installation procedures lead to
incorrect results
 The Installation Manual (IM) should be test-checked
 Installation must be an automated process
 The IM must be accompanied by an Installation
Check-list cum Certificate (ICC)
Types of System Testing
• Recovery Testing : to determine whether
the system or program meets its
requirements for recovery after a failure
Backup Media
Online / Offline Backup
Timing Backup
Restore

Types of System Testing
• Serviceability Testing : to identify
conditions under which the system's
serviceability will not meet requirements
Types of System Testing
• Reliability/availability Testing : to
determine whether the system meets its
reliability & availability requirements (24x7)
Acceptable & expected manner of
handling abnormal conditions
No surprise, no unexpected behaviour

System Testing
Objectives:  To verify that the system components perform
               control functions
              To perform inter-system tests
              To demonstrate that the system performs both
               functionally and operationally as specified
              To perform appropriate types of tests relating to
               Transaction Flow, Installation, Reliability,
               Regression, etc.
When:        After Integration Testing
Input:       Detailed Requirements & External Application Design
              Master Test Plan
              System Test Plan
Output:      System Test Report
Who:         Development Team and Users
Methods:     Problem / Configuration Management
Tools:       Recommended set of tools
Education:   Testing Methodology
              Effective use of tools
Systems Integration Testing
Objectives:  To test the co-existence of products and applications
               that are required to perform together in the
               production-like operational environment (hardware,
               software, network)
              To ensure that the system functions together with
               all the components of its environment as a total
               system
              To ensure that the system releases can be deployed
               in the current environment
When:        After system testing
              Often performed outside of the project life-cycle
Input:       Test Strategy
              Master Test Plan
              Systems Integration Test Plan
Output:      Systems Integration Test Report
Who:         System Testers
Methods:     White and Black Box techniques
              Problem / Configuration Management
Tools:       Recommended set of tools
Education:   Testing Methodology
              Effective use of tools
