
Defect Target Metrics & Exit Criteria Metrics

VTB&SC
Vietnamese Testing Board and Software Consulting
ABOUT THIS PRESENTATION
SOFTWARE QUALITY METRICS METHODOLOGY
 Establish
 Identify
 Implement
 Analyze
 Validate

METRICS
 Defect Target Metrics
 Exit Criteria Metrics
Software Quality Metrics Methodology
DEFINITION

Metric: A measurement scale and the method used for measurement. [ISO 14598]

Measurement: The process of assigning a number or category to an entity to describe an attribute of that entity. [ISO 14598]
SOFTWARE QUALITY METRICS
IEEE 1061 : Standard for Software Quality Metrics Methodology
Establish quality requirements

Identify Metrics

Implement Metrics

Analyze Metrics

Validate Metrics
IDENTIFY : INDICATOR / FACTOR
Indicator / Factor – Measures

Defect quantities
 Defect quantities normalized per function point (or per LOC)
 Defects by level of severity
 Defects by category or cause

Complexity
 McCabe's cyclomatic complexity
 Design complexity measures
 Predicted defects and maintenance costs, based on complexity measures

Cost of defects
 Business losses per defect that occurs during operation
 Business interruption costs; costs of work-arounds
 Lost sales and lost goodwill
 Litigation costs resulting from defects
 Annual maintenance cost (per function point)
 Annual operating cost (per function point)
 Measurable damage to your boss's career

Costs of quality activities
 Costs of reviews, inspections and preventive measures
 Costs of test planning and preparation
 Costs of test execution, defect tracking, version and change control
 Costs of diagnostics, debugging and fixing
 Costs of tools and tool support
 Costs of testing & QA education associated with the product

Re-work
 Re-work effort (hours, as a percentage of the original coding hours)
 Re-worked LOC (source lines of code, as a percentage of the total delivered LOC)
 Re-worked software components (as a percentage of the total delivered components)

Reliability
 Availability
 Mean time between failure (MTBF)
 Mean time to repair (MTTR)
 Reliability ratio (MTBF / MTTR)
 Number of product recalls or fix releases
 Number of production re-runs as a ratio of production runs

* SATC Software Quality Model


IMPLEMENT : QUALITY METRICS
ROI (Return On Investment) Metrics
Defect Metrics
Productivity Metrics
Effort Metrics
Schedule Metrics
Change Metrics
End-user satisfaction Metrics
Technical Metrics
Warranty Metrics
Reputation Metrics
And so on…
IMPLEMENT : QUALITY METRICS
Defect Metrics
 Defect Origin Metrics – coding, design, requirements, etc., so that the
specific software engineering areas can be improved.
 Defect Removal Effectiveness – DRE = E / (E + D), where E = number of
errors found before delivery of the software and D = number of errors found
after delivery of the software (a sketch of this calculation follows this slide).
 Defect Density Metric – defects normalized by product size (e.g., per KLOC or
per function point). Usually more than 95% of the defects are found within four
years of the software's release; for application software, most defects are
normally found within two years of its release.
 Defect Category Metrics – defect categories which can be maintained
(added, modified or deleted) by the client.
 Defect Severity Metrics – e.g., Critical, Major and Minor – so that more
effort can be dedicated to analyzing Critical defects and driving improvements.
 Defect Resource Metrics – e.g., effort spent per defect found,
effort per defect-recognition ratio.
 Defects per Code Metrics – bugs per line of code.
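
The sketch below shows one way the DRE and defect-density calculations above could be wired up. The function names and the example counts are illustrative assumptions, not figures from this deck.

```python
# Minimal sketch: Defect Removal Effectiveness (DRE) and defect density.
# The example counts below are illustrative assumptions, not project data.

def defect_removal_effectiveness(pre_release_defects: int, post_release_defects: int) -> float:
    """DRE = E / (E + D): E = errors found before delivery, D = errors found after delivery."""
    return pre_release_defects / (pre_release_defects + post_release_defects)

def defect_density(total_defects: int, size_kloc: float) -> float:
    """Defects per KLOC; function points could be used as the size unit instead."""
    return total_defects / size_kloc

if __name__ == "__main__":
    e, d = 370, 30  # hypothetical pre- and post-delivery defect counts
    print(f"DRE = {defect_removal_effectiveness(e, d):.2%}")                # DRE = 92.50%
    print(f"Density = {defect_density(e + d, 80.0):.1f} defects per KLOC")  # 5.0 defects per KLOC
```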
ANALYZE : ANALYSIS METHODS
Function Point Analysis (FPA)
Use Case Point Analysis
Feature Point Analysis & Estimate
Test Point Analysis (TPA)
Sampling Methods: (Random, Stratified, Cluster)
Cause & Effect Diagrams
ABC analysis
Cost and Schedule Estimation Methodologies/Techniques
Source Lines-of-Code Analysis
Lines-of-requirements Analysis
Number-of-classes and interfaces Analysis
Defect Target Metrics (DTM) and Exit Criteria Metrics (ECM)

DTM and ECM
Quality – Time – Cost

Both DTM and ECM follow the same methodology steps:
• Establish
• Identify
• Implement
• Analyze
DTM (DEFECT TARGET METRICS) - ESTABLISH
 System Integration Testing Period
 100% of the number of expected defects found during the Integration
Testing Period. The number shall be based on the Test Case Factor.

 Business Acceptance Testing Period
 15% of the number of defects found in the Integration Testing Period.

 Production Roll-out Testing Period
 3% of the number of defects found in the Integration Testing Period,
assuming a Roll-out Testing Period of 2 weeks.

(A sketch of this staged-target calculation follows this slide.)

* Based on risk, function point analysis and previous projects
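
A minimal sketch of the staged targets above, assuming the expected Integration-period defect count has already been estimated from the Test Case Factor; the value passed in below is a placeholder, not a figure from this deck.

```python
# Minimal sketch: staged defect targets derived from an expected
# Integration-period defect count. The input value is a placeholder assumption.

STAGE_RATIOS = {
    "System Integration Testing": 1.00,     # 100% of the expected defects
    "Business Acceptance Testing": 0.15,    # 15% of Integration-period defects
    "Production Roll-out (2 weeks)": 0.03,  # 3% of Integration-period defects
}

def defect_targets(expected_integration_defects: int) -> dict:
    """Apply the per-stage ratios from the slide to the expected defect count."""
    return {stage: round(expected_integration_defects * ratio)
            for stage, ratio in STAGE_RATIOS.items()}

if __name__ == "__main__":
    for stage, target in defect_targets(1000).items():  # 1000 is a hypothetical count
        print(f"{stage}: {target}")
```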


DTM (DEFECT TARGET METRICS) - IDENTIFY
 Method
 Make the numbers as realistic as possible
 Take them from previous projects
 Use statistics from the same industry domain

 Requirement Size
 Corrected lines of requirements
 The number of Use Cases (e.g., 14,052 Use Cases in total)

 Product Volume Factor
 Lines of source code
 The number of features
 The number of menus

 Severity Factor
 Each menu on a screen
 Each function / module
DTM (DEFECT TARGET METRICS) - IDENTIFY
 Defect
 1 defect per 25 lines of requirements (the QA Effort factor).
 1 defect per 2.5 executed Test Cases.

 Test Case
 1 Test Case per 10 corrected lines of requirements.
 1 Test Case per 1 Use Case and per 10 Test Procedures.
 1 Test Procedure per corrected line of requirement.

 Ratio
 Distribution of High / Low severity = 30 / 70.
 Function and other testing / Regression testing = 80 / 20.

 Exit Criteria: 3–10% of defects found, depending on the period.

(A sketch that combines these factors into targets follows this slide.)
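
A minimal sketch of how the factors above could be combined to estimate the Integration-period defect target from requirement size; the requirement-line count used in the example is an illustrative assumption.

```python
# Minimal sketch: estimate the Integration-period defect target from corrected
# requirement lines, using the factors on this slide. The 25,000-line input
# is an illustrative assumption, not project data.

REQ_LINES_PER_TEST_CASE = 10         # 1 Test Case per 10 corrected lines of requirements
TEST_CASES_PER_DEFECT = 2.5          # 1 defect per 2.5 executed Test Cases
HIGH_SHARE, LOW_SHARE = 0.30, 0.70   # High / Low severity distribution = 30 / 70

def integration_target(corrected_req_lines: int) -> dict:
    """Derive test-case and defect targets, split by severity."""
    test_cases = corrected_req_lines / REQ_LINES_PER_TEST_CASE
    expected_defects = test_cases / TEST_CASES_PER_DEFECT
    return {
        "test_cases": round(test_cases),
        "high_severity_defects": round(expected_defects * HIGH_SHARE),
        "low_severity_defects": round(expected_defects * LOW_SHARE),
    }

if __name__ == "__main__":
    print(integration_target(25_000))
    # {'test_cases': 2500, 'high_severity_defects': 300, 'low_severity_defects': 700}
```

Note that the two defect factors on the slide agree with each other: 10 lines per Test Case multiplied by 2.5 Test Cases per defect gives the stated 25 lines of requirements per defect.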


DTM (DEFECT TARGET METRICS) - IMPLEMENT
Module | System Integration (High / Low) | Acceptance (High / Low) | Production (High / Low) | Total (High / Low)
A      | 10 / 24   | 3 / 11  | 2 / 3  | 15 / 38
B      |  9 / 21   | 4 / 9   | 1 / 1  | 14 / 31
C      | 15 / 38   | 2 / 8   | 0 / 1  | 17 / 47
D      | 35 / 10   | 5 / 9   | 0 / 1  | 40 / 20
E      | 19 / 45   | 2 / 5   | 1 / 1  | 22 / 51
F      | 14 / 34   | 4 / 5   | 2 / 3  | 20 / 42
G      | 30 / 70   | 3 / 9   | 1 / 1  | 34 / 80
H      | 57 / 103  | 7 / 9   | 0 / 1  | 64 / 113
I      | 42 / 98   | 5 / 12  | 0 / 1  | 47 / 111
J      | 25 / 45   | 2 / 1   | 1 / 2  | 28 / 48
K      | 27 / 55   | 4 / 7   | 0 / 1  | 31 / 63
L      | 11 / 18   | 2 / 2   | 0 / 1  | 13 / 21
M      | 20 / 50   | 2 / 4   | 0 / 1  | 22 / 55
N      |  2 / 3    | 1 / 1   | 1 / 1  |  4 / 5
Total  | 316 / 614 | 46 / 92 | 9 / 19 | 371 / 725


DTM (DEFECT TARGET METRICS) - IMPLEMENT

Business area         | System Integration Testing (High / Low) | Business Acceptance Testing (High / Low) | Production (Roll-out) (High / Low) | Total (High / Low)
Defect Target Metrics | 316 / 614                               | 46 / 92                                  | 9 / 19                             | 371 / 725
Sub-Total             | 930                                     | 138                                      | 28                                 | 1096
ECM (EXIT CRITERIA METRICS) - IDENTIFY
 System Integration Testing Period
 15% of the number of expected defects found during the Integration Testing
Period. The number shall be based on the Test Case Factor.

 Business Acceptance Testing Period
 10% of the number of defects found in the Integration Testing Period.

 Production Roll-out Testing Period
 10% of the number of Low-priority defects found in the Integration Testing
Period, assuming a Roll-out Testing Period of 2 weeks.

 Limitation
 NO ZERO DEFECT approach
 Regression testing
 Lack of information, time and resources
 Issues that are not reproducible

(A sketch of an exit-criteria check follows this slide.)
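
A minimal sketch of one way the exit-criteria thresholds above could be checked at the end of a test period, reading each threshold as a cap on the defects still outstanding; the mapping and the sample counts are illustrative assumptions.

```python
# Minimal sketch: check outstanding defects against the exit-criteria thresholds
# on this slide. The sample counts below are illustrative assumptions.

EXIT_RATIOS = {
    "System Integration Testing": 0.15,   # 15% of expected Integration-period defects
    "Business Acceptance Testing": 0.10,  # 10% of Integration-period defects
    "Production Roll-out": 0.10,          # 10% of Low-priority Integration defects
}

def meets_exit_criteria(period: str, reference_defects: int, outstanding_defects: int) -> bool:
    """True when outstanding defects do not exceed the allowed share of the reference count."""
    allowed = EXIT_RATIOS[period] * reference_defects
    return outstanding_defects <= allowed

if __name__ == "__main__":
    # e.g., 930 expected Integration-period defects, 120 still outstanding
    print(meets_exit_criteria("System Integration Testing", 930, 120))  # True (120 <= 139.5)
```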
DTM (DEFECT TARGET METRICS) + ECM (EXIT CRITERIA METRICS)

Business area         | System Integration Testing (High / Low) | Business Acceptance Testing (High / Low) | Production (Roll-out) (High / Low) | Total (High / Low)
Defect Target Metrics | 316 / 614                               | 46 / 92                                  | 9 / 19                             | 371 / 725
Sub-Total             | 930                                     | 138                                      | 28                                 | 1096
Exit Criteria Metrics | 46 / 92                                 | 5 / 9                                    | 0 / 2                              | 51 / 103
Sub-Total             | 138                                     | 14                                       | 2                                  | 154
Fixed                 | 792                                     | 124                                      | 26                                 | 942
