
SOFTWARE TESTING (SWT301)

TESTING THROUGHOUT THE


SOFTWARE LIFECYCLE (CHAPTER 2)
Contents
• Software Lifecycle Models
• Test Levels
• Test Types
• Maintenance Testing

Software Lifecycle Models
Waterfall Model

Software Lifecycle Models
Iterative & Incremental Model

Software Lifecycle Models
Agile Development

Software Lifecycle Models
Agile Development
The 4 Agile values

Software Lifecycle Models
Agile Methodologies
Scrum Framework
Scrum Events:
• Sprint
• Sprint Planning
• Daily Scrum
• Sprint Review
• Sprint Retrospective
Scrum Team:
• Product Owner
• Development Team (self-organizing, cross-functional)
• Scrum Master
Artifacts:
• Product Backlog
• Sprint Backlog
• Increment

Software Lifecycle Models
Early Test Design
• test design finds faults
• faults found early are cheaper to fix
• most significant faults found first
• faults prevented, not built in
• no additional effort, only re-scheduled test design
• requirement changes triggered by test design happen early
Early test design helps to build quality in and stops fault multiplication.

Software Lifecycle Models
Verification vs Validation
• Verification
– The process of evaluating a system or
component to determine whether the
products of the given development phase
satisfy the conditions imposed at the start of
that phase
– Do the thing right
• Validation
– Determination of the correctness of the
products of software development with
respect to the user needs and requirements
– Do the right thing
Software Lifecycle Models
V-Model for Testing

[V-model diagram: development phases on the left arm paired with test levels on the right arm — early test design on the way down, test running on the way up]
SW Development Lifecycle Models
Iterative Development Model

• Involvement of user representatives in the testing
• Difficult to test due to the lack of formal documentation
• Frequent changes to the requirements
• Changes to the working environment

Test Levels
Overview
Characteristics of good testing
• Early test design
• Each work-product is tested
• Testers are involved in reviewing requirements before they are released

Test Levels
Typical levels of testing
The term test level provides an indication of the focus of
the testing, and the types of problems it is likely to uncover.
Each of these test levels will include tests designed to
uncover problems specifically at that stage of
development.
The typical levels of testing are: component (unit) testing, integration testing, system testing and acceptance testing.

Test Levels
Component (Unit) Testing
• Units (constructed in isolation, for integration later)
are also called programs, modules or components
• Unit testing is intended to ensure that the code
written for the unit meets its specification, prior to
its integration with other units.
• Objectives:
– checking conformance to the program specification
– verifies that all of the code that has been written for the
unit can be executed
• Bases: the component requirements; the detailed
design; the code itself
• Test objects: the components, the programs, data
conversion/migration programs and database
modules
• Performer: the developer who wrote the code
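As an illustration (the `discount` function and its specification are invented, not from the course material), a component test checks the unit against its specification, in isolation, and executes every branch of its code:

```python
# Hypothetical unit under test. Its specification: orders of 100 or
# more get a 10% discount; smaller orders get none; negative totals
# are rejected.
def discount(total):
    if total < 0:
        raise ValueError("total must be non-negative")
    return total * 0.9 if total >= 100 else total

# Component tests, written by the developer who wrote the code:
# they check conformance to the specification and execute every branch.
def test_no_discount_below_threshold():
    assert discount(99) == 99

def test_discount_at_threshold():
    assert discount(100) == 90.0

def test_negative_total_rejected():
    try:
        discount(-1)
        assert False, "expected ValueError"
    except ValueError:
        pass
```

In practice such tests would be collected and run by a unit-test framework (e.g. pytest or unittest) rather than called by hand.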
Test Levels
Integration Testing
• Integration is to put the units together to create the system once they have been written.
• Objectives: expose defects in the interfaces & in
the interactions between integrated components
or systems
• Bases: software & system design, sequence
diagrams, workflows, use cases, interface
specifications,..
• Test objects: subsystems, databases, APIs, interfaces, microservices, infrastructure,..
• Performers: independent developers (component
integrations), testers (system integrations)
Test Levels
Integration Testing
Stubs vs Drivers
• Stub: replaces a called component for integration testing
– keep it simple:
• print/display name ("I have been called")
• reply to calling module (single value)
• computed reply (variety of values)
• prompt for reply from tester
• search a list of replies
• provide a timing delay
• Driver: replaces a calling component for integration testing
– specially written or general purpose (commercial tools)
– invokes the baseline
– sends any data the baseline expects
– receives any data the baseline produces
• Each baseline has different requirements from the test-driving software
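As a minimal sketch (the component names `report` and `fetch_user` are invented for illustration), a stub stands in for a component the baseline calls, while a driver stands in for the component that would call the baseline:

```python
# Baseline under integration test: a hypothetical report component
# that depends on a user-lookup component not yet integrated.
def report(user_id, fetch_user):
    user = fetch_user(user_id)
    return "Report for " + user["name"]

# Stub: replaces the *called* component. Kept simple -- it announces
# that it was called and replies with a single canned value.
def fetch_user_stub(user_id):
    print("fetch_user stub called with", user_id)
    return {"name": "Alice"}

# Driver: replaces the *calling* component. It invokes the baseline,
# sends the data the baseline expects, and receives what it produces.
def run_driver():
    result = report(42, fetch_user_stub)
    assert result == "Report for Alice"
    return result
```

Here the dependency is passed in explicitly to keep the sketch short; in real code stubs are often substituted with a mocking library such as `unittest.mock`.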
Test Levels
Integration Testing
Big-Bang
Integration
• In theory:
– if we have already tested components why
not just combine them all at once? Wouldn’t
this save time?
– (based on false assumption of no faults)
• In practice:
– takes longer to locate and fix faults
– re-testing after fixes more extensive
– end result? takes more time

Test Levels
Integration Testing
Incremental Integration
• Baseline 0: tested component
• Baseline 1: two components
• Baseline 2: three components, etc.
• Advantages:
– easier fault location and fix
– easier recovery from disaster / problems
– interfaces should have been tested in
component tests, but ..
– add to tested baseline
Test Levels
Integration Testing
Incremental Integration: Top-Down
• Baselines:
– baseline 0: component a
– baseline 1: a + b
– baseline 2: a + b + c
– baseline 3: a + b + c + d
– etc.
• Need to call lower-level components not yet integrated
• Stubs: simulate the missing components
[Component hierarchy diagram: a at the top; b and c below it; then d, e, f, g; then h, i, j, k, l, m; n and o at the bottom]
Test Levels
Integration Testing
Incremental Integration: Bottom-Up
• Baselines:
– baseline 0: component n
– baseline 1: n + i
– baseline 2: n + i + o
– baseline 3: n + i + o + d
– etc.
• Needs drivers to call the baseline configuration
• Also needs stubs for some baselines
[Same component hierarchy diagram as for top-down integration]
Test Levels
Integration Testing
Incremental Integration: Minimum Capability Integration (Functional)
• Baselines:
– baseline 0: component a
– baseline 1: a + b
– baseline 2: a + b + d
– baseline 3: a + b + d + i
– etc.
• Needs stubs
• Shouldn't need drivers
[Same component hierarchy diagram as for top-down integration]
Test Levels
Integration Testing
Incremental Integration: Thread Integration (Functional)
• The order of processing some event (e.g. an interrupt, a user transaction) determines the integration order
• Minimum capability integration, ordered in time
[Same component hierarchy diagram as for top-down integration]
Test Levels
System Testing
• Having checked that the components work individually and together, the next stage, system testing, is to:
– consider the functionality from an end-to-end perspective
– focus on the behavior of the whole system/product as defined by the scope of a development project or program
• Main objectives:
– Validating that the system is complete and will work as
expected
– Verifying whether or not functional & non-functional behaviors of the system are as they should be (as specified)
• Main bases: system and software requirement
specifications; use cases; functional specifications,..
• Test objects: generally the system under test.
Test Levels
System Testing
• Last integration step
• Functional
– Functional requirements: specification for functions
that a system or system component must perform
– Requirements-based testing: uses specification of
requirements as the basis for identifying tests
– Business process-based testing:
• User expectations: what will be used most often/critical to the
biz?
• Business scenarios: typical business transactions (birth to
death)
• Use cases: prepared cases based on real situations
• Non-functional
– as important as functional requirements
– often poorly specified
– must be tested
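As a sketch of business process-based system testing (the `OrderSystem` class and its requirements are hypothetical), a typical business transaction is exercised end-to-end through the system's outermost interface, from the "birth" of an order to its "death":

```python
# A toy system assembled from its (already integrated) components.
class OrderSystem:
    def __init__(self):
        self.orders = {}
        self.next_id = 1

    def place_order(self, item, qty):
        # Requirement: every order gets a unique id and starts "placed".
        order_id = self.next_id
        self.next_id += 1
        self.orders[order_id] = {"item": item, "qty": qty, "status": "placed"}
        return order_id

    def ship_order(self, order_id):
        self.orders[order_id]["status"] = "shipped"

    def status(self, order_id):
        return self.orders[order_id]["status"]

# System test: a business scenario followed from birth (placing the
# order) to death (shipping it), checked against the specified
# behavior at each step, using only the public interface.
def test_order_lifecycle():
    system = OrderSystem()
    oid = system.place_order("book", 2)
    assert system.status(oid) == "placed"
    system.ship_order(oid)
    assert system.status(oid) == "shipped"
```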
Test Levels
Acceptance Testing
• Produce information to assess the system's readiness for release or deployment to end-users or customers.
• The focus is on validation, the use of the system
for real, how suitable the system is to be put into
production.
• Main objectives: provide the end users with
confidence that the system will function
according to their expectations
• Main bases: business processes, user or
business requirements, use cases, system
requirements,..
• Test objects: generally the system under test.
• Performers: customers or users of the system
Test Levels
Acceptance Testing
Typical forms of acceptance testing
• User acceptance testing
• Operational acceptance testing
• Contract and regulation acceptance
testing
• Alpha and beta testing
– Alpha testing takes place at the developer’s
site
– Beta testing takes place at the customer’s site

Test Levels
Comparisons 1/2

Test Levels
Comparisons 2/2

Test Types
• Functional testing: testing of function
– Requirement-based testing
– Business-process-based testing
– Focus: suitability, interoperability testing, security,
accuracy and compliance
• Non-functional testing: testing of software product quality characteristics
– reliability, usability, efficiency, maintainability, portability
• White-box / structural testing: testing of
software structure/architecture
• Testing related to changes: confirmation and
regression testing
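To illustrate testing related to changes (the `price_with_tax` function and its defect history are invented): a confirmation test re-runs the case that exposed a fixed defect, while regression tests re-run previously passing cases to show the fix broke nothing else:

```python
# Hypothetical unit that previously rounded incorrectly and has
# just been fixed.
def price_with_tax(price, rate=0.1):
    return round(price * (1 + rate), 2)

# Confirmation test: re-executes the originally failing case to
# confirm the reported defect really is fixed.
def test_reported_defect_fixed():
    assert price_with_tax(19.99) == 21.99

# Regression tests: re-execute previously passing cases to check
# the fix has not introduced new faults elsewhere.
def test_zero_price_unchanged():
    assert price_with_tax(0) == 0

def test_default_rate_unchanged():
    assert price_with_tax(100) == 110.0
```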
Maintenance Testing
• Testing the changes to an operational system, or the impact of a changed environment on an operational system. Possible triggers:
– Modifications
– Migration
– Retirement
• Testing to preserve quality:
– different sequence
• development testing executed bottom-up
• maintenance testing executed top-down
• different test data (live profile)
– breadth tests to establish overall confidence
– depth tests to investigate changes and critical areas
– predominantly regression testing

Maintenance Testing
Why maintenance testing?
During or after deploying the system to production, it may become necessary to change the system because of:
• Additional features being required.
• The system being migrated to a new
operating platform.
• The system being retired – data may need to
be migrated or archived.
• Planned upgrade to commercial off-the-shelf
(COTS)-based systems.
• New faults being found requiring fixing (these
can be ‘hot fixes’).
Maintenance Testing
What to test in maintenance testing?
• Test any new or changed code
• Impact analysis
– what could this change have an impact on?
– how important is a fault in the impacted area?
– test what has been affected, but how much?
• most important affected areas?
• areas most likely to be affected?
• whole system?
• The answer: “It depends”

