Software Testing
* The process of executing a program or
application with the intent of finding
software bugs (errors or other defects), and
verifying that the software product is fit for
use.

OBJECTIVES
The student should be made to:
* Expose the criteria for test cases.
* Learn the design of test cases.
* Be familiar with test management and test
automation techniques.
* Be exposed to test metrics and
measurements.

UNIT-I
INTRODUCTION
Testing as an Engineering Activity — Testing as a
Process — Testing axioms — Basic definitions —
Software Testing Principles — The Tester's Role
in a Software Development Organization —
Origins of Defects — Cost of defects — Defect
Classes — The Defect Repository and Test Design
— Defect Examples — Developer/Tester Support
of Developing a Defect Repository — Defect
Prevention strategies.

UNIT-II
TEST CASE DESIGN
Test case Design Strategies — Using Black Box
Approach to Test Case Design — Random Testing —
Requirements based testing — Boundary Value
Analysis — Equivalence Class Partitioning — State-
based testing — Cause-effect graphing — Compatibility
testing — user documentation testing — domain testing
— Using White Box Approach to Test design — Test
Adequacy Criteria — static testing vs. structural testing
— code functional testing — Coverage and Control Flow
Graphs — Covering Code Logic — Paths — code
complexity testing — Evaluating Test Adequacy
Criteria.

UNIT-III
LEVELS OF TESTING
The need for Levels of Testing — Unit Test — Unit Test
Planning — Designing the Unit Tests — The Test Harness
— Running the Unit tests and Recording results —
Integration tests — Designing Integration Tests —
Integration Test Planning — Scenario testing — Defect
bash elimination System Testing — Acceptance testing —
Performance testing — Regression Testing —
Internationalization testing — Ad-hoc testing — Alpha,
Beta Tests — Testing OO systems — Usability and
Accessibility testing — Configuration testing —
Compatibility testing — Testing the documentation —
Website testing.

UNIT-IV
TEST MANAGEMENT
People and organizational issues in testing —
Organization structures for testing teams —
testing services — Test Planning — Test Plan
Components — Test Plan Attachments —
Locating Test Items — test management — test
process — Reporting Test Results — The role of
three groups in Test Planning and Policy
Development — Introducing the test specialist
— Skills needed by a test specialist — Building a
Testing Group.

UNIT-V
TEST AUTOMATION
Software test automation — skill needed for
automation — scope of automation — design
and architecture for automation —
requirements for a test tool — challenges in
automation — Test metrics and measurements
— project, progress and productivity metrics.

TEXT BOOKS:
1. Srinivasan Desikan and Gopalaswamy Ramesh,
“Software Testing — Principles and Practices”, Pearson
Education, 2006.
2. Ron Patton, “Software Testing”, Second Edition, Sams
Publishing, Pearson Education, 2007.
REFERENCES:
1. Ilene Burnstein, “Practical Software Testing”, Springer
International Edition, 2003.
2. Edward Kit, “Software Testing in the Real World —
Improving the Process”, Pearson Education, 1995.
3. Boris Beizer, “Software Testing Techniques”, 2nd
Edition, Van Nostrand Reinhold, New York, 1990.
4. Aditya P. Mathur, “Foundations of Software Testing —
Fundamental Algorithms and Techniques”, Dorling
Kindersley (India) Pvt. Ltd., Pearson Education, 2008.

Testing as an Engineering Activity
Poor quality software that can cause loss of
Engineering ActivityPoor quality software that can cause loss of
life or property is no longer acceptable to
society.
Failure can result in catastrophic losses.
Highly qualified staff make sure that software
products are built on time, within budget, and
are of the highest quality.
IEEE & ACM have defined a code of ethics for the software
engineering discipline.

Using an engineering approach to software
development means the following:
The development process is well understood.
Projects are planned.
Life cycle models are defined and adhered to.
Standards are in place for product and process.
Measurements are employed to evaluate product and
process quality.
Components are reused.
Validation and verification processes play a key role in
quality determination.
Engineers have proper education, training, and
certification.

* A test specialist is one whose education is based
on the principles, practices and processes that
constitute the software engineering discipline
and whose specific focus is on one area of that
discipline, software testing.
Knowledge on the following
* Test related principles,
* Processes,
* Measurements,
* Standards,
* Plans,
* Tools and methods,
* How to apply them to the testing tasks.

Role of process in software quality
Process: the set of methods, practices,
standards, documents, activities, policies,
and procedures that software engineers use
to develop and maintain a software system
and its associated artifacts, such as project
and test plans, design documents, code, and
manuals.

[FIG. 1.2 Components of an engineered process: policies,
standards, activities and documents, methods and
techniques, and procedures, together with process
evolution across versions.]

Testing as a process
[Figure: the validation process is embedded within the
software development process.]
Validation is the process of evaluating a software
system or component during, or at the end of,
the development cycle in order to determine
whether it satisfies specified requirements.
Verification is the process of evaluating a
software system or component to determine
whether the products of a given development
phase satisfy the conditions imposed at the start
of that phase.

Testing Axioms
* An axiom — a sentence or proposition that is
not proved, but is considered obvious and accepted
as a basis for building a theory.

Testing Axioms
1. THE STAKEHOLDER AXIOM: testing needs
stakeholders.
2. THE VALUE AXIOM: the value of intelligence is
independent of who produces it.
3. THE SCOPE MANAGEMENT AXIOM: if we don't
manage scope, we may never meet stakeholder
expectations.
4. THE GOOD-ENOUGH AXIOM: the scope of
testing and acceptance are always compromises.
5. THE TEST BASIS AXIOM: testers need sources of
knowledge to select things to test
6. THE ORACLE AXIOM: testers need sources of
knowledge to evaluate actual outcomes or
behaviours
7.THE COVERAGE AXIOM: testing needs a test
coverage model or models
8. THE PRIORITISATION AXIOM: prioritizing the tests
needs a mechanism for ordering tests by value.
9. THE FALLIBILITY AXIOM: our sources of
knowledge are fallible and incomplete.
10. THE REPEAT-TEST AXIOM: some repeated tests
are unavoidable
11. THE EXECUTION SEQUENCING AXIOM: run our
most valuable tests first - we may not have time
to run them later
12. THE ENVIRONMENT AXIOM: test execution
requires a known, controlled environment.
13. THE EVENT AXIOM: testing never goes as
planned; evidence arrives in discrete quanta.
14. THE EXECUTION SEQUENCING AXIOM: run
our most valuable tests first - we may not
have time to run them later
15. THE DESIGN AXIOM: test design is based on
models.
16. THE DELIVERY AXIOM: the intelligence produced by
testing determines the value of testing.

Basic definitions
Errors: An error is a mistake, misconception, or
misunderstanding on the part of a software developer.
Faults (Defects): A fault (defect) is introduced into the
software as the result of an error. It is an anomaly in
the software that may cause it to behave incorrectly,
and not according to its specification.
Failures: A failure is the inability of a software system or
component to perform its required functions within
specified performance requirements.

A test case: A test-related item which contains
the following information:
1. A set of test inputs.
2. Execution conditions.
3. Expected outputs.

A test: A test is a group of related test cases and
test Procedures.
Test Oracle: A test oracle is a document, or piece
of software that allows testers to determine
whether a test has been passed or failed.
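A toy illustration of an oracle as a piece of software (the `TestCase` and `oracle` names are invented for this sketch, not taken from any standard):

```python
# A software test oracle in miniature: given a test case's expected
# output, it delivers a pass/fail verdict for an actual output.
# TestCase and oracle are illustrative names, not a standard API.

from dataclasses import dataclass, field

@dataclass
class TestCase:
    inputs: dict                                      # 1. test inputs
    conditions: dict = field(default_factory=dict)    # 2. execution conditions
    expected: object = None                           # 3. expected output

def oracle(test_case, actual):
    """Verdict: has the test been passed or failed?"""
    return "pass" if actual == test_case.expected else "fail"

tc = TestCase(inputs={"a": 2, "b": 3}, expected=5)
print(oracle(tc, 2 + 3))   # -> pass
```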
Test Bed: A test bed is an environment that
contains all the hardware and software
needed to test a software component or a
software system.

Quality: the degree to which a system, system
component, or process meets specified
requirements.
Metric: A metric is a quantitative measure of the
degree to which a system, system component, or
process possesses a given attribute.
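For instance, defect density (defects per thousand lines of code) is a commonly used metric of this kind; a minimal sketch, with an illustrative function name not taken from the text:

```python
# Defect density as an example metric: a quantitative measure of
# the "defect-proneness" attribute of a component. Illustrative only.

def defect_density(defects_found, lines_of_code):
    """Defects per thousand lines of code (KLOC)."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return defects_found / (lines_of_code / 1000)

print(defect_density(12, 4000))   # 12 defects in 4 KLOC -> 3.0
```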
The software quality assurance (SQA) group is a
team of people with the necessary training and
skills to ensure that all necessary actions are
taken during the development process so that the
resulting software conforms to established
technical requirements.
REVIEW: A review is a group meeting whose
purpose is to evaluate a software artifact or a set
of software artifacts.

Principles
Principle 1. Testing is the process of exercising a
software component using a selected set of
test cases, with the intent of (i) revealing
defects, and (ii) evaluating quality.
Principle 2. When the test objective is to detect
defects, then a good test case has a high
probability of revealing undetected defect(s).
Principle 3. Test results should be inspected
meticulously.
Principle 4. A test case must contain the
expected output or result.
Principle 5. Test cases should be developed for
both valid and invalid input conditions.
Principle 6. The probability of the existence of
additional defects in a software component is
proportional to the number of defects already
detected in that component.
Principle 7. Testing should be carried out by a
group that is independent of the development
group.
Principle 8. Tests must be repeatable and
reusable.
Principle 9. Testing should be planned.
Principle 10. Testing activities should be
integrated into the software life cycle.
Principle 11. Testing is a creative and challenging
task.

The Tester's Role in a Software
Development Organization
1. The tester's job is to
* Reveal defects,
* Find weak points,
* Inconsistent behaviour,
* Circumstances where the software does not work
as expected.
2. A tester needs very good programming
experience in order to understand how code is
constructed, and to know where, and what types
of, defects could occur.
3. Testers work with the developers to produce high-quality
software that meets the customers’
requirements.
4. Testers also need to work with requirements
engineers to make sure that requirements are
testable, and to plan for system and acceptance
test.
5. Testers also need to work with designers to plan
for integration and unit test.
6. Testers also need to cooperate with software
quality assurance staff and software engineering
process group members.
7. Testers need the support of management.

* Testers are specialists; their main function is to
plan, execute, record, and analyse tests.
* They do not debug software.
* When defects are detected during testing,
software should be returned to the
developers.

ORIGINS OF DEFECTS
* Defects have negative effects on software use.
Software engineers work very hard to produce
high-quality software with a low number of
defects.

Education: The software engineer did not
have the proper educational background to
prepare the software artifact.
Communication: The software engineer was
not informed about something by a colleague.
Oversight: The software engineer omitted to
do something.
Transcription: The software engineer knows
what to do, but makes a mistake in doing it.
Process: The process used by the software
engineer misdirected his or her actions.

A tester develops hypotheses about possible
defects. Test cases are then designed based on
the hypotheses. The hypotheses are used to,
Design test cases.
Design test procedures.
Assemble test sets.
Select the testing levels suitable for the tests.
Evaluate the results of the tests.

1. Fault Model
A fault (defect) model can be described as a
link between the error made, and the
fault/defect in the software.
2. Defect Repository
To increase the effectiveness of their testing
and debugging processes, software
organizations need to initiate the creation of a
defect database, or defect repository. The
defect repository supports storage and
retrieval of defect data from all projects in a
centrally accessible location.

DEFECT CLASSES
The four classes of defects are as follows,
* Requirements and specifications defects,
* Design defects,
* Code defects,
* Testing defects.

1. Requirements and Specifications Defects
The beginning of the software life cycle is
important for ensuring high quality in the
software being developed. Defects injected in
early phases can be very difficult to remove in
later phases. Since many requirements
documents are written using a natural language
representation, they may become
Ambiguous,
Contradictory,
Unclear,
Redundant,
Imprecise.

1.1 Functional Description Defects
* The overall description of what the product
does, and how it should behave
(inputs/outputs), is incorrect, ambiguous,
and/or incomplete.

1.2 Feature Defects
* Features are described as distinguishing
characteristics of a software component or
system. Feature defects are due to feature
descriptions that are missing, incorrect,
incomplete, or unnecessary.

1.3 Feature Interaction Defects
* These are due to an incorrect description of how
the features should interact with each other.
1.4 Interface Description Defects
* These are defects that occur in the description of
how the target software is to interface with
external software, hardware, and users.

2. Design Defects
Design defects occur when the following are
incorrectly designed,
* System components,
* Interactions between system components,
* Interactions between the components and
outside software/hardware, or users.

2.1 Algorithmic and Processing Defects
* These occur when the processing steps in the algorithm as
described by the pseudo code are incorrect.
2.2 Control, Logic, and Sequence Defects
* Control defects occur when logic flow in the pseudo code is
not correct.
2.3 Data Defects
* These are associated with incorrect design of data
structures.
2.4 Module Interface Description Defects
* These defects occur because of incorrect or inconsistent
usage of parameter types, incorrect number of parameters
or incorrect ordering of parameters.

2.5 Functional Description Defects
* The defects in this category include incorrect,
missing, or unclear design elements.
2.6 External Interface Description Defects
* These are derived from incorrect design
descriptions for interfaces with COTS(commercial
off-the-shelf) components, external software
systems, databases, and hardware devices.

3. Coding Defects
* Coding defects are derived from errors in
implementing the code. Coding defect classes
are similar to design defect classes. Some
coding defects come from a failure to
understand programming language
constructs, and miscommunication with the
designers.

3.1 Algorithmic and Processing Defects
* Code related algorithm and processing defects
include
* Unchecked overflow and underflow conditions,
* Comparing inappropriate data types,
* Converting one data type to another,
* Incorrect ordering of arithmetic operators,
* Misuse or omission of parentheses,
* Precision loss,
* Incorrect use of signs.

3.2 Control, Logic and Sequence Defects
* This type of defect includes incorrect
expression of case statements, incorrect
iteration of loops, and missing paths.
3.3 Typographical Defects
* These are mainly syntax errors, for example,
incorrect spelling of a variable name, that are
usually detected by a compiler, self-reviews,
or peer reviews.

3.4 Initialization Defects
* This type of defect occurs when initialization
statements are omitted or are incorrect. This may
occur because of misunderstandings or lack of
communication between programmers, or between
programmers and designers, carelessness, or
misunderstanding of the programming environment.
3.5 Data-Flow Defects
* Data-Flow defects occur when the code does not
follow the necessary data-flow conditions.
3.6 Data Defects
* These are indicated by incorrect implementation of
data structures.

3.7 Module Interface Defects
* Module Interface defects occur because of using
incorrect or inconsistent parameter types, an
incorrect number of parameters, or improper
ordering of the parameters.
3.8 Code Documentation Defects
* When the code documentation does not describe
what the program actually does, or is incomplete
or ambiguous, it is called a code documentation
defect.

3.9 External Hardware, Software Interfaces Defects
These defects occur because of problems related to
* System calls,
* Links to databases,
* Input/output sequences,
* Memory usage,
* Resource usage,
* Interrupts and exception handling,
* Data exchanges with hardware,
* Protocols,
* Formats,
* Interfaces with build files,
* Timing sequences.

4. Testing Defects
* Test plans, test cases, test harnesses, and test
procedures can also contain defects. These
defects are called testing defects. Defects in
test plans are best detected using review
techniques.

4.1 Test Harness Defects
A test harness or automated test framework is a
collection of software and test data configured
to test a program unit by running it under varying
conditions and monitoring its behavior and
outputs. It has two main parts: the test execution
engine and the test script repository.
* In order to test software, at the unit and
integration levels, auxiliary code must be
developed. This is called the test harness or
scaffolding code. The test harness code should be
carefully designed, implemented, and tested
since it is a work product and this code can be
reused when new releases of the software are
developed.

4.2 Test Case Design and Test Procedure Defects
* These consist of incorrect, incomplete,
missing, inappropriate test cases, and test
procedures.

Defect Example: The Coin Problem
Specification for the program calculate_coin_values:
This program calculates the total rupee value for a
set of coins. The user inputs the number of 25p,
50p and 1-rupee coins. There are six different
denominations of coins. The program outputs the
total rupees and paise value of the coins to the
user.
Input: number_of_coins is an integer
Output: number_of_rupees is an integer
number_of_paise is an integer

1. Design Description for the Coin Problem
Design Description for Program calculate_coin_values

program calculate_coin_values
  number_of_coins is integer
  total_coin_value is integer
  number_of_rupees is integer
  number_of_paise is integer
  coin_values is array of six integers representing
    each coin value in paise,
    initialized to: 25, 25, 100
  initialize total_coin_value to zero
  initialize loop_counter to one
  while loop_counter is less than six
  begin
    output "enter number of coins"
    read (number_of_coins)
    total_coin_value = total_coin_value +
      number_of_coins * coin_value[loop_counter]
    increment loop_counter
  end
  number_of_rupees = total_coin_value / 100
  number_of_paise = total_coin_value - 100 * number_of_rupees
  output (number_of_rupees, number_of_paise)
end

2. Design Defects in the Coin Problem
Control, logic, and sequencing defects. The defect in this
subclass arises from an incorrect “while” loop condition
(should be less than or equal to six)
Algorithmic, and processing defects. These arise from the
lack of error checks for incorrect and/or invalid inputs, lack
of a path where users can correct erroneous inputs, lack of
a path for recovery from input errors.
Data defects. This defect relates to an incorrect value for
one of the elements of the integer array, coin_values,
which should be 25, 50, 100.
External interface description defects. These are
defects arising from the absence of input messages or
prompts that introduce the program to the user and
request inputs.
3. Coding Defects in the Coin Problem
Control, logic, and sequence defects. These include
the loop variable increment step which is out of the
scope of the loop. Note that incorrect loop condition
(i<6) is carried over from design and should be counted
as a design defect.
Algorithmic and processing defects. The division
operator may cause problems if negative values are
divided, although this problem could be eliminated
with an input check.
Data Flow defects. The variable total_coin_value is not
initialized. It is used before it is defined.
* Data Defects. The error in initializing the array
coin_values is carried over from design and
should be counted as a design defect.
* External Hardware, Software Interface
Defects. The call to the external function “scanf”
is incorrect. The address of the variable must be
provided.
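Pulling the fixes identified above together, a corrected version might look like the following sketch. Python stands in for the pseudocode; the three denominations actually named in the spec (25p, 50p, 1 rupee) are assumed, and the function shape is illustrative:

```python
# Corrected sketch of the coin calculation, applying the fixes noted
# above: coin_values holds 25, 50, 100 (not 25, 25, 100), the total
# is initialized before use, every denomination is visited, and
# invalid counts are rejected. Names and interface are illustrative.

def coin_total(counts, coin_values=(25, 50, 100)):
    """Return (rupees, paise) for per-denomination coin counts."""
    if len(counts) != len(coin_values):
        raise ValueError("one count per denomination required")
    if any(c < 0 for c in counts):
        raise ValueError("coin counts cannot be negative")
    total_coin_value = 0                       # initialized before use
    for count, value in zip(counts, coin_values):
        total_coin_value += count * value      # running total in paise
    number_of_rupees = total_coin_value // 100
    number_of_paise = total_coin_value - 100 * number_of_rupees
    return number_of_rupees, number_of_paise
```

For example, one coin of each denomination totals 175 paise, so `coin_total([1, 1, 1])` returns `(1, 75)`.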
* Code Documentation Defects. The
documentation that accompanies this code is
incomplete and ambiguous. It reflects the
deficiencies in the external interface description
and other defects that occurred during
specification and design.

Developer/Tester Support for Developing a
Defect Repository
A requirement for repository development
should be a part of testing and/or debugging
policy statements.
Forms and templates should be designed to
collect the data.
Each defect and frequency of occurrence must
be recorded after testing.
Defect monitoring should be done for each
on-going project.

[Figure: the defect repository supports TMM maturity
goals, including test process improvement.]

* The defect data is useful for test planning. It is
a TMM level 2 maturity goal. It helps a tester
to select applicable testing techniques, design
the test cases, and allocate the amount of
resources needed to detect and remove
defects. This allows the tester to estimate testing
schedules and costs.
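A defect record supporting these planning uses might be shaped like this; the field names here are assumptions for illustration, not a standard repository schema:

```python
# Hypothetical defect repository record plus a simple planning query.
# Field names are assumptions, not taken from the text or a standard.

from dataclasses import dataclass
from collections import Counter

@dataclass
class DefectRecord:
    project: str
    defect_class: str       # e.g. "design", "coding", "testing"
    phase_injected: str
    phase_detected: str
    frequency: int = 1      # frequency of occurrence, recorded after testing

def defects_by_class(repository):
    """Tally frequency per defect class, e.g. to focus test design."""
    counts = Counter()
    for record in repository:
        counts[record.defect_class] += record.frequency
    return dict(counts)
```

A query like this is the kind of data that helps a tester choose techniques and allocate resources toward the defect classes seen most often in past projects.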
The defect data can support debugging
activities also.

A defect repository can help in implementing
several TMM maturity goals including
* Controlling and monitoring of test,
* Software quality evaluation and control,
* Test measurement,
* Test process improvement.

DEFECT PREVENTION STRATEGIES
The purpose of Defect Prevention is to identify
the Root cause of defects and prevent them
from recurring.
This involves analyzing defects that were
encountered in the past and taking specific
actions to prevent the occurrence of those
types of defects in the future.
It also enhances productivity.
It reduces rework effort.

Methods of Defect Prevention
* Reviews & Inspections: Self-Review, Peer Review
& Inspections.
* Walkthroughs: prototyping of the actual design
that gives you the basic idea of the product
functionality along with its look & feel.
* Defect Logging and Documentation: provide key
parameters that support Defect Analysis and
Measurements.
* Root Cause Analysis.
* Fishbone Analysis.

Targeting Process Improvement
[Figure: defect prevention flow — defects found in each
phase; selection of defects under each phase for causal
analysis; causal analysis / identification of root causes
of selected defects by the quality group; defect
prevention (DP) actions proposed by the quality group;
implementation of DP actions. The quality group is an
independent team comprising domain, technical and
quality experts.]

SOFTWARE DEFECT FRAMEWORK
The software defect framework highlighting the 5 Ds of defect
origin is proposed in this work.
Each one of the D’s concentrates on defects in one particular
stage of software development lifecycle like Requirements,
Design, Coding, Testing and defects due to timeline
problem.
After doing a thorough analysis of various defect types under
each stage of software development, the most prominent
defects are identified.
The defects are then prioritized based on their importance
and the top six defects from each category are taken for study.

Certain types of these defects are marked Explicit (E),
signifying that they influence the process, and certain other
defect types are marked Implicit (I) to signify
those defects that are reported by the customer.
Utmost care should be given to such defect types in order
to satisfy customers. For each one of the identified
defect type, the reason for such defect is found out and
the DP actions are suggested. These DP actions, when
introduced at all stages of a software lifecycle, can
reduce the time and resource necessary to develop
high quality systems.

Deficiency in Requirements (D1)

Pain Point: Vague Customer Needs
  Root Cause: Business Analyst (BA) not well qualified
  Prevention: Training needed
Pain Point: Timeline Problem
  Root Cause: Accepting unreasonable timeline of the customer
  Prevention: Feasibility study
Pain Point: Inadequate End User Review
  Root Cause: End user not spending quality time for review
  Prevention: Requirements signed off
Pain Point: Partial Prerequisites
  Root Cause: Not using a formal requirements checklist for analysis
  Prevention: Quality team validates the checklist
Pain Point: Misinterpreted Requirements
  Root Cause: Communication gap between project stakeholders
  Prevention: BA and QT make DT understand the requirements
Pain Point: Futurities
  Root Cause: Add-on to the existing requirement, without knowing it is a new requirement
  Prevention: BA and QT freeze scope

DESIGN FLAWS
Pain Point: Sub-Optimal Design
  Root Cause: Design features described do not provide the best (optimal) approach towards the solution required
  Prevention: PM and design experts formalize the best design approach
Pain Point: Change in Technical Design
  Root Cause: The design change would have a maximum impact when a new requirement is introduced at a later stage of the project
  Prevention: New requirements undergo formal configuration management
Pain Point: Database Design Issues
  Root Cause: The design team might not have the expertise to handle the normalization
  Prevention: PM seeks help of design experts
Pain Point: Missing Design Artefact
  Root Cause: The design template may have many sections and only the major mandatory sections might have got covered
  Prevention: QT needs justification
Pain Point: Ambiguous Design
  Root Cause: Design document includes ambiguous use of words or unclear design features
  Prevention: Need support of document writers
Pain Point: Traceability Issues
  Root Cause: Inconsistencies between design and requirement specification
  Prevention: Traceability matrix used

Defective coding process
Pain Point: Project Leader's inability in resource allocation
  Root Cause: Inefficient Work Breakdown Structure (WBS)
  Prevention: PM should review the WBS assigned by the PL and should take mitigation steps if he foresees any risk in it
Pain Point: Lack of Technical Expertise
  Root Cause: Project team members are not trained properly (or) may not have worked in a similar technical domain
  Prevention: Skilled resources with technical work experience should be inducted into the project, if the project has tight deadlines
Pain Point: Missing Coding Artefact
  Root Cause: Due to shortage of time
  Prevention: PL should find reasons for time constraints and should talk with PM to see if the estimated effort has to be relooked at for sufficiency
Pain Point: Non-adherence to coding standards
  Root Cause: To complete the code fast and due to time constraints, developers may not adhere to coding standards
  Prevention: PM/Quality Team can cross-check whether PL is doing the code review or not
Pain Point: Incomplete code review
  Root Cause: PL might feel that testing the code would suffice and may skip the review
  Prevention: Quality Team can do audits/checks to see if the review is done formally
Pain Point: Coding Errors
  Root Cause: Programmers not familiar with developing similar types of code, or errors may be due to careless mistakes
  Prevention: Training should have been provided before the development starts

Delinquency in testing
Pain Point: Inadequate test cases
  Root Cause: The testers may lack knowledge of the possible scenarios of the test path for the requirements
  Prevention: The BA should run through all possible scenarios of the actual business with the test team, so that the testers can build a strong test case base for the project
Pain Point: Pretermission
  Root Cause: Due to lack of time and schedule slippage, pressure might mount on the team to move to the next level of testing
  Prevention: The PM shouldn't hide the facts as to what level the testing is performed, and should involve the client for a joint decision
Pain Point: Dissimilar test environment
  Root Cause: Space restrictions in the test machine
  Prevention: Space restrictions, if any, should be handled by PM and client mutually
Pain Point: Uncovered test bed simulation
  Root Cause: Unavailability of database space may restrict the test bed setup
  Prevention: Size of database required for testing has to be planned (by PM and PL) upfront, and sufficient time should be given to the IT team for DB space allocation
Pain Point: Inadequate testing
  Root Cause: Unavailability of the testing tool software; experienced testers not available
  Prevention: While planning the project, the PM should decide (along with the client) how these tests can be performed; testers with expertise in such type of testing can be used from other teams in the organization. Proper planning should be done before the start of SDLC activities

Duration slippage
Pain Point: Synthesising risk upfront
  Root Cause: PM may not be analysing all possible risks
  Prevention: PM should analyse all the risk factors related to the project, like People, Environment, Scope, Client, Sub-vendors, etc., and all subcategories of these
Pain Point: Programmer Capability
  Root Cause: Programmer may not be technically capable of finishing the work in the allotted time
  Prevention: Skilled programmers are to be employed for tasks on the critical path
Pain Point: Uncovered milestone monitoring
  Root Cause: Project control not exercised properly; monitoring of milestones not done
  Prevention: PM has to track in the Project Management Tool, and the Quality / Senior Management Team should do periodic reviews on this
Pain Point: Entertaining scope creep
  Root Cause: PM may not distinguish whether it is a new requirement (or) an addendum to the existing requirement
  Prevention: PM should have overall control of the project
Pain Point: Resource Unavailability
  Root Cause: PM not judging the exact resource requirement for the project
  Prevention: PM should plan for human and non-human resources in the initial planning stage of the project itself
Pain Point: Inconsiderable external dependencies
  Root Cause: PM/PL may underestimate external dependencies
  Prevention: PM/PL should always have some buffer while planning for external dependencies

Defect severity
Weightage factor 1: These are the extremely severe defects, which
have already halted or are capable of halting the operation of the
business system.
Weightage factor 3: These are also severe defects, which have not
halted the system, but have seriously degraded the performance of
some business operation.
Weightage factor 5: These types of defects are the ones which are
primarily related to the presentation or the layout of the data.
However, there is no danger of corruption of data or incorrect
values.

Test Maturity Model
The Test Maturity Model is one such model,
which has a set of structured levels. TMM has
now been superseded by Test Maturity Model
Integration (TMMi), a 5-level model which
provides a framework to measure the
maturity of the testing processes.

[Figure: the five levels of the Test Maturity Model
(TMM), from Level 1 (Initial) to Level 5 (Optimization).]
Benefits of test process improvement are the following:
> smarter testers
> higher quality software
> the ability to meet budget and scheduling goals
> improved planning
> the ability to meet quantifiable testing goals
* Maturity goals
Each maturity level, except 1, contains certain
maturity goals.
For an organization to reach a certain level, the
corresponding maturity goals must be met by the
organization.
* Maturity subgoals
Maturity goals are supported by maturity
subgoals.

* ATRs (activities, tasks, and responsibilities)
Maturity subgoals are achieved by means of
ATRs.
ATRs address issues concerning implementation
of activities and tasks.
Three groups:
* Managers
* Developers and test engineers
* Customers

1.5 TMM Levels
[Figure: TMM level summary]
Level 5: Optimization / Defect Prevention and Quality Control
  - Test process optimization
  - Quality control
  - Application of process data for defect prevention
Level 4: Management and Measurement
  - Establish an organization-wide review program
  - Establish a test management program
  - Evaluate software quality
Level 3: Integration
  - Control and monitor the testing process
  - Integrate testing into the software life cycle
  - Establish a technical training program
  - Establish a software test organization
Level 2: Phase Definition
  - Institutionalize basic testing techniques and methods
  - Initiate a test planning process
  - Develop testing and debugging goals

> Level 1 — Initial
>There are no maturity goals to be met at this level.
» Testing begins after code is written.
> An organization performs testing to demonstrate that
the system works.
No serious effort is made to track the progress of
testing.
»Test cases are designed and executed in an ad hoc
manner.
> In summary, testing is not viewed as a critical, distinct
phase in software development.

> Level 2 — Phase Definition: The maturity goals are as follows:
» Develop testing and debugging goals.
» Some concrete maturity subgoals that can support this goal are as follows:
> Organizations form committees on testing and debugging.
>The committees develop and document testing and debugging goals.
Initiate a test planning process. (Identify test objectives. Analyze risks. Devise
strategies. Develop test specifications. Allocate resources.)
»Some concrete maturity subgoals that can support this goal are as follows:
> Assign the task of test planning to a committee.
>The committee develops a test plan template.
» Proper tools are used to create and manage test plans.
» Provisions are put in place so that customer needs constitute a part of the
test plan.
> Institutionalize basic testing techniques and methods.
>The following concrete subgoals support the above maturity subgoal.
> An expert group recommends a set of basic testing techniques and
methods.
»The management establishes policies to execute the recommendations.

> Level 3 — Integration: The maturity goals are as follows:
> Establish a software test group.
» Concrete subgoals to support the above are:
» An organization-wide test group is formed with leadership,
support, and funding.
>The test group is involved in all stages of the software
development.
>Trained and motivated test engineers are assigned to the
group.
» The test group communicates with the customers.
> Establish a technical training program.
> Integrate testing into the software lifecycle.
» Concrete subgoals to support the above are:
»The test phase is partitioned into several activities: unit,
integration, system, and acceptance testing.
> Follow the V-model.
> Control and monitor the testing process.
»Concrete subgoals to support the above are:
» Develop policies and mechanisms to monitor and control test
projects.
» Define a set of metrics related to the test project.
> Be prepared with a contingency plan.

> Level 4 — Management and Measurement: The maturity goals are:
> Establish an organization-wide review program.
> Maturity subgoals to support the above are as follows.
>The management develops review policies.
>The test group develops goals, plans, procedures, and recording
mechanisms for carrying out reviews.
> Members of the test group are trained to be effective.
> Establish a test management program.
> Maturity subgoals to support the above are as follows.
> Test metrics should be identified along with their goals.
>A test measurement plan is developed for data collection and analysis.
> An action plan should be developed to achieve process improvement.
> Evaluate software quality.
> Maturity subgoals to support the above are as follows.
>The organization defines quality attributes and quality goals for products.
>The management develops policies and mechanisms to collect test
metrics to support the quality goals.

> Level 5 — Optimization, Defect Prevention and Quality Control: The maturity
goals are as follows:
> Application of process data for defect prevention
> Maturity subgoals to support the above are as follows.
» Establish a defect prevention team.
> Document defects that have been identified and removed.
» Each defect is analyzed to get to its root cause.
» Develop an action plan to eliminate recurrence of common defects.
» Statistical quality control
> Maturity subgoals to support the above are as follows.
» Establish high-level measurable quality goals. (Ex. Test case execution
rate, defect arrival rate, ...)
> Ensure that the new quality goals form a part of the test plan.
> The test group is trained in statistical testing and analysis methods.