SWT2 Tim
Chapter 2
CONTENT
• Test levels
• Test types
• Maintenance testing
Software Development Lifecycle
A software development lifecycle model describes the types of activity performed at each stage in a software development project, and how the activities relate to one another logically and chronologically.
Characteristics of Good Testing
[in any software lifecycle model]
The analysis & design of tests for a given test level should
begin during the corresponding software development
activity.
SDLC Models
• Sequential
• Iterative & Incremental
Sequential Development Models
• Requirements ↔ User Acceptance Testing
• Specifications ↔ Software System Testing
• High-level Design ↔ Integration Testing
• Detailed Design ↔ Component Testing
• Implementation (Development)
V-Model: Late Test Design
[V-model diagram: test design for every level (acceptance, system, integration, component) is deferred until after implementation, so tests are written late]
V-Model: Early Test Design
[V-model diagram: tests for each level (system, integration, component) are designed during the corresponding specification/design phase and run after implementation]
Early test design
Early test design helps to build quality and stops fault multiplication.
VV&T
• Verification
o the process of evaluating a system or component to determine
whether the products of the given development phase satisfy the
conditions imposed at the start of that phase [BS 7925-1]
• Validation
o determination of the correctness of the products of software
development with respect to the user needs and requirements [BS
7925-1]
• Testing
o the process of exercising software to verify that it satisfies specified
requirements and to detect faults
Verification, Validation and Testing
[Venn diagram: testing overlaps with both verification and validation]
Incremental Development Models
• High-level test planning & test analysis occur at the outset of the project; detailed test planning, analysis, design, and implementation occur at the start of each iteration/increment.
• Test execution involves overlapping test levels.
• Many of the same tasks are performed, but with varied timing and extent.
• Common issues
o More regression testing
o Defects outside the scope of the iteration/increment
o Less thorough testing
Rational Unified Process (RUP)
High level test planning
(Before planning for a set of tests)
See: Structured Testing, an introduction to TMap®, Pol & van Veenendaal, 1998
2. Introduction
o Software items and features to be tested
o References to project authorisation, project plan, QA plan, CM plan,
relevant policies & standards
3. Test items
o Test items including version/revision level
o How transmitted (net, disc, CD, etc.)
o References to software documentation
4. Features to be tested
• Identify test design specification / techniques
6. Approach
o activities, techniques and tools
o detailed enough to estimate (cost?)
o specify degree of comprehensiveness (e.g. coverage) and other
completion criteria (e.g. faults)
o identify constraints (environment, staff, deadlines)
9. Test Deliverables
• Test plan
• Test design specification
• Test case specification
• Test procedure specification
• Test item transmittal reports
• Test logs
• Test incident reports
• Test summary reports
Component Testing
• Lowest level
• Tested in isolation – use of stubs and/or drivers
• Most thorough look at detail
o Error handling
o Interfaces
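A minimal sketch of component testing in isolation, using a stub in place of a collaborator (the component, service name, and tax figures are all hypothetical):

```python
from unittest.mock import Mock

# Hypothetical component under test: computes an order total using a
# tax-rate service. The service is stubbed so the component can be
# tested in isolation, before the real service exists.
def order_total(prices, tax_service):
    subtotal = sum(prices)
    return round(subtotal * (1 + tax_service.rate_for("default")), 2)

# The stub stands in for the (possibly unwritten) tax service.
tax_stub = Mock()
tax_stub.rate_for.return_value = 0.20

assert order_total([10.0, 5.0], tax_stub) == 18.0   # 15.00 + 20% tax
assert order_total([], tax_stub) == 0.0             # empty-order edge case
```

Because the stub's answer is fixed, a failure here points at the component itself, not at its environment.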
TDD cycle: FAIL → PASS → REFACTOR (write a failing test, make it pass, then refactor)
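One pass round the TDD cycle might look like this (a hypothetical leap-year example; the comments mark the FAIL/PASS/REFACTOR steps):

```python
# 1. FAIL  – the test is written first, before is_leap exists, and fails.
# 2. PASS  – the simplest implementation that makes the test pass is added.
# 3. REFACTOR – the code is cleaned up, re-running the test after each change.

def test_leap_year():
    assert is_leap(2024)
    assert not is_leap(2023)
    assert not is_leap(1900)   # century years are not leap years...
    assert is_leap(2000)       # ...unless divisible by 400

# Implementation grown just far enough to make the test pass.
def is_leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

test_leap_year()
```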
Integration Testing
• Component integration testing
• System integration testing
• In theory:
o if we have already tested components why not just combine them
all at once? Wouldn’t this save time?
o (based on false assumption of no faults)
• In practice:
o takes longer to locate and fix faults
o re-testing after fixes more extensive
o end result? takes more time
Incremental Integration
• Baselines:
o baseline 0: component a
o baseline 1: a + b
o baseline 2: a + b + c
o baseline 3: a + b + c + d
o etc.
[Component hierarchy diagram: a at the top, with b–c, d–g, h–m, and n–o on successive levels below]
• Advantages:
o Critical control structure tested first and most often
o Can demonstrate system early (show working menus)
• Disadvantages:
o Needs stubs
o Detail left until last
o May be difficult to "see" detailed output (but should have been
tested in component test)
o May look more finished than it is
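A sketch of one top-down baseline: the control component and one child are real, while the other child is still a stub (all component names here are hypothetical):

```python
# Top-down integration, baseline 1: real components a and b are
# integrated; component c is not yet ready and is replaced by a stub.

def c_stub(request):
    # Stand-in for component c, returning a canned answer.
    return "c-default"

def b(request):
    # Real lower-level component.
    return f"b handled {request}"

def a(request, route_to_b, route_to_c):
    # Top-level control component: dispatches requests to its children.
    if request.startswith("b:"):
        return route_to_b(request[2:])
    return route_to_c(request)

# Baseline 1 tests exercise the control structure through real a + b,
# while calls into c's subtree hit the stub.
assert a("b:order", b, c_stub) == "b handled order"
assert a("other", b, c_stub) == "c-default"
```

At baseline 2 the stub would be swapped for the real c and the same tests re-run.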
Bottom-up Integration
• Baselines:
o baseline 0: component n
o baseline 1: n + i
o baseline 2: n + i + o
o baseline 3: n + i + o + d
o etc.
• Needs drivers to call the baseline configuration
• Also needs stubs for some baselines
• Advantages:
o lowest levels tested first and most thoroughly (but should have
been tested in unit testing)
o good for testing interfaces to external environment (hardware,
network)
o visibility of detail
• Disadvantages
o no working system until last baseline
o needs both drivers and stubs
o major control problems found last
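A sketch of a bottom-up driver: a throwaway harness that calls a low-level component before its real caller exists (component n and the record format are hypothetical):

```python
# Bottom-up integration, baseline 0: component n (a record parser) is
# exercised by a driver that plays the role of its missing caller.

def n(raw_record):
    # Low-level component: parse one "key=value" record into a dict.
    key, _, value = raw_record.partition("=")
    return {key.strip(): value.strip()}

def driver():
    # Driver: feeds representative inputs to n and checks the results,
    # exactly as the real caller i eventually will.
    cases = [("id = 7", {"id": "7"}), ("name=bob", {"name": "bob"})]
    for raw, expected in cases:
        assert n(raw) == expected
    return "all baseline-0 driver checks passed"

print(driver())
```

Once the real caller is integrated, the driver is discarded, which is part of the cost this strategy carries.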
Minimum Capability Integration
(aka. Functional)
• Baselines:
o baseline 0: component a
o baseline 1: a + b
o baseline 2: a + b + d
o baseline 3: a + b + d + i
o etc.
• Needs stubs
• Shouldn't need drivers (if top-down)
Pros & cons of Minimum Capability
• Advantages:
o Control level tested first and most often
o Visibility of detail
o Real working partial system earliest
• Disadvantages
o Needs stubs
Thread Integration
(also called functional)
• Order of processing some event determines integration order
• e.g. an interrupt or a user transaction
• Minimum capability in time
• Advantages:
o Critical processing first
o Early warning of performance problems
• Disadvantages:
o may need complex drivers and stubs
Integration Guidelines
Test basis: software & system requirements specifications; risk analysis reports; use cases; epics & user stories; system models; state diagrams; system & user manuals.
User acceptance testing
• Done by end-users
• Focus: business processes
• Environment: real / simulated operational environment
• Aim: to build confidence that the system will enable users to perform what they need to do with a minimum of difficulty, cost, and risk
• Approach:
o Mixture of scripted and unscripted testing
o "Model Office" concept sometimes used
Why customer / user involvement
• Users know:
o what really happens in business situations
o complexity of business relationships
o how users would do their work using the system
o variants to standard tasks (e.g. country-specific)
o examples of real cases
o how to identify sensible work-arounds
Benefit: detailed understanding of the new system.
Acceptance Testing: OAT
Test basis: business processes; user/business requirements; regulations, legal contracts & standards; use cases; system requirements; system/user documentation; risk analysis reports; backup & recovery procedures; disaster recovery plan; non-functional requirements; operations documentation; performance targets; DB packages; security standards.
Test objects: system under test (SUT); system configuration & configuration data; recovery system; hot sites; forms; reports.
"If you don't have patience to test the system, the system will surely test your patience."
CONTENT
• Test levels
• Test types
• Maintenance testing
Test Types
• Functional requirements
o a requirement that specifies a function that a system or system
component must perform (ANSI/IEEE Std 729-1983, Software
Engineering Terminology)
• Functional specification
o the document that describes in detail the characteristics of the
product with regard to its intended capability (BS 4778 Part 2, BS
7925-1)
[1] Functional Testing: Requirements-based
• Business scenarios
o typical business transactions (start to finish)
• Use cases
o prepared cases based on real situations
[1] Functional Testing: Coverage
[2] Non-functional Testing: Performance
• Timing Tests
o Response and service times
o Database back-up times
• Concurrency Tests
o Small numbers, large benefits
o Detect record locking problems
• Load Tests
o The measurement of system behaviour under realistic multi-user
load
• Stress Tests
o Go beyond limits for the system - know what will happen
o Particular relevance for e-commerce
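A minimal sketch of the kind of defect a concurrency test hunts for: several threads update one shared record, and only the locked update path is guaranteed not to lose writes (the account example and figures are hypothetical):

```python
import threading

# Concurrency test sketch: several threads deposit into one shared account.
# The locked version must never lose an update; the unlocked version has a
# read-modify-write race window.
class Account:
    def __init__(self):
        self.balance = 0
        self.lock = threading.Lock()

    def deposit_unsafe(self, amount):
        current = self.balance            # read
        self.balance = current + amount   # write: race window between the two

    def deposit_safe(self, amount):
        with self.lock:                   # record locked during the update
            self.balance += amount

def run(deposit, threads=8, per_thread=10_000):
    acct = Account()
    workers = [threading.Thread(
        target=lambda: [deposit(acct, 1) for _ in range(per_thread)])
        for _ in range(threads)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    return acct.balance

# With locking the total is deterministic. Without it, updates may be
# lost, but the shortfall varies run to run, so no assertion is made on
# the unsafe case here.
assert run(lambda acct, amt: acct.deposit_safe(amt)) == 80_000
```

As the slide notes, even small numbers of concurrent users in a test like this can expose record-locking defects that single-user functional tests never reach.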
Security Testing
• Passwords
• Encryption
• Hardware permission devices
• Levels of access to information
• Authorisation
• Covert channels
• Physical security
Configuration and Installation
• Configuration Tests
o Different hardware or software environment
o Configuration of the system itself
o Upgrade paths - may conflict
• Installation Tests
o Distribution (CD, network, etc.) and timings
o Physical aspects: electromagnetic fields, heat, humidity, motion,
chemicals, power supplies
o Uninstall (removing installation)
Reliability / Qualities
• Reliability
o "System will be reliable" - how to test this?
o "2 failures per year over ten years"
o Mean Time Between Failures (MTBF)
o Reliability growth models
• Other Qualities
o Maintainability, Portability, Adaptability, etc.
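The MTBF figure from a statement like "2 failures per year over ten years" can be computed directly; a small sketch (the observation figures are invented for illustration):

```python
# MTBF sketch: mean time between failures over an observation period.
def mtbf(total_operating_hours, failures):
    if failures == 0:
        raise ValueError("no failures observed; MTBF cannot be computed")
    return total_operating_hours / failures

# "2 failures per year over ten years" of continuous operation:
hours = 10 * 365 * 24        # 87,600 hours observed
assert mtbf(hours, 20) == 4380.0   # one failure every 4,380 hours on average
```

Turning a vague claim like "the system will be reliable" into a numeric target such as this is what makes the requirement testable.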
Back-up and Recovery
• Back-ups
o Computer functions
o Manual procedures (where are tapes stored)
• Recovery
o Real test of back-up
o Manual procedures unfamiliar
o Should be regularly rehearsed
o Documentation should be detailed, clear and thorough
Documentation Testing
• Documentation review
o check for accuracy against other documents
o gain consensus about content
o documentation exists, in right format
• Documentation tests
o is it usable? does it work?
o user manual
o maintenance documentation
[3] White-box Testing
CONTENT
• Test levels
• Test types
• Maintenance testing
Maintenance testing
• Alternatives (when specifications are missing)
o the way the system works now must be right (except for the specific change)
o use existing system as the baseline for regression tests
o look in user manuals or guides (if they exist)
o ask the experts - the current users