
IT Quality

Metrics and Metric Analyses


Session Objectives
• Increase your knowledge about metrics / measures
• Understand the process of collecting, storing and
analyzing metrics
– Understand terms ‘process capability’ and ‘process performance’
– Use measurement data to control and improve process
performance
– Understand issues faced while addressing statistical process
control for software
– Understand what it means to control and predict the software
process
• IS NOT TO
– Provide you with an answer to all problems
– Tell you how to do the job

2
Setting the Context
• “Which way I ought to go from here?"
"That depends a good deal on where you want to go."
"I don't much care where –"
"Then it doesn't matter which way you go.” 
― Lewis Carroll, Alice in Wonderland
• If you don’t know where you are going, any road will do -
CHINESE PROVERB
• If you don’t know where you are, a map won’t help - WATTS
S. HUMPHREY
• Only Measured processes can be Managed
• Cannot measure, cannot control! - Peter Drucker

3
Why Measure
Do we have the answers to the following:
1. How much did we produce?
2. Are our products of high quality?
3. Are we improving?
4. How well do we compare with other companies?
5. Are our customers satisfied with us?
6. What is the ROI of software process improvement
initiatives?

Without the right information, you are just another person with
an opinion
4
Why Measure
• Processes are critical to executing strategies
and plans aimed at business objectives
• Processes must be controlled and improved to
achieve organizational business goals

5
Why Measure
• Management view - Dashboard tells at a glance
– What is being achieved
– What is the quality and productivity
– When Situation is out of Control
– Key Elements to focus on
• Engineering view - Measures help to
– better understand attributes of software that we have produced
– assess the quality of our product

It is not really a question of “Why measure?” but
“Why not measure?”

6
Management Dilemma
Information Overload
– Low level data
– Too many measures

Selecting key indicators
– Quality?
– Size?
– Progress?

7
• Let’s start at the very beginning

8
Basics
• Process - A set of planned and systematic activities
implemented to achieve certain goals or objectives
• Product - The result or output of a process or set of
processes
• Measure - A quantified observation on some attribute or
aspect of the software product, process or project
• Metric - A quantitative determination of the extent to which
a system, component or process possesses a certain
attribute - generally a ratio
• Population — all items of interest
• Sample — subset of data from the population
• Random sample — every item in the population has an
equal chance of being in the sample
9
Exercise
• What are these - Product or Process:
– System design
– Functional Specifications
– Testing
– Documentation
– Training
• What are these – Measure or Metric
– No. of Function points
– No. of defects per FP
– Defects per man month
– No. of defects

10
Exercise
• What are these - Product or Process:
– System design: Product
– Functional Specifications: Product
– Testing: Process
– Documentation: Product
– Training : Process
• What are these – Measure or Metric
– No. of Function points: Measure
– No. of defects per FP: Metric
– Defects per man month: Metric
– No. of defects : Measure

11
Metric Attributes
• Simple, Precise, Definable
• Objective
• Easily Obtainable
• Valid
• Robust

12
Types of Data
• Attribute data (qualitative, words)
– Categories (strongly agree, agree, etc. . .)
– Yes, no
– Pass/ fail; good/ bad

13
Types of Data
• Variable data (quantitative)
– Discrete (count) data
• Data is not capable of being meaningfully subdivided
into more precise increments
• Sample size needed is much larger
• E.g., # of times customer hangs up before response
– Continuous data
• Decimal subdivisions are meaningful
• Ex: time to answer the telephone
• Sample size of 30 is usually adequate
14
Measures of Central Tendency
• Mean
– influenced by extreme values
– most commonly used measure of the center of
the distribution
• Median
– The middle value
– Not influenced by extreme values

Other means: weighted, geometric, harmonic


15
Centering and Spread

[Figure: dot plots contrasting distributions with different centering and spread]

16
Measures of Dispersion
• Range = Largest value minus smallest
value
• The standard deviation is a measure of
the spread of the data
• The variance is the square of the
standard deviation
• The range and standard deviation are both
sensitive to extreme values
17
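The measures on the two slides above can be computed directly with Python's standard `statistics` module. A minimal sketch; the effort-hours sample is hypothetical and chosen so that one extreme value pulls the mean but not the median:

```python
# Central tendency and dispersion for a small (hypothetical) sample of
# review effort hours. Note the extreme value 30.0 at the end.
import statistics

effort_hours = [4.0, 5.0, 5.0, 6.0, 30.0]

mean = statistics.mean(effort_hours)          # pulled up by the outlier
median = statistics.median(effort_hours)      # robust to the outlier
value_range = max(effort_hours) - min(effort_hours)
stdev = statistics.stdev(effort_hours)        # sample standard deviation
variance = statistics.variance(effort_hours)  # square of the std deviation

print(mean, median, value_range)  # mean is far from the median here
```

With this sample the mean (10.0) sits well above the median (5.0), illustrating why both are reported.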
Normal Distribution
• Distribution of data with certain consistent properties
• Properties useful to understand characteristics of the
underlying process from which the data were obtained
• Many natural phenomena and man-made processes can be
approximated by a normal distribution
• A continuous random variable X has a normal
distribution if its values fall into a smooth (continuous)
curve with a bell-shaped pattern
• Each normal distribution has its own mean and SD
• Mean and median are the same and lie directly in the
middle of the distribution (due to symmetry)
18
- 6  - 5 - 4 - -2 - 1  + 1 + 2 + 3 + 4 + 5
+ 6

68.3%
95.4%
99.7%
99.99999975%

19
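The coverage percentages above can be checked numerically: for a normal distribution, the fraction of values within ±k sigma of the mean is erf(k/√2). A quick verification using only the standard library:

```python
# Verify the sigma-coverage figures of the normal distribution.
import math

def within_k_sigma(k: float) -> float:
    """Fraction of a normal population within ±k sigma of the mean."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3, 6):
    print(f"±{k} sigma: {within_k_sigma(k):.9%}")
```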
The Process

[Process flow diagram: the steps below are introduced one at a time on the following slides]

20
The Process
• Confirm Measurement Objectives
– Important to know why we want to Measure:
measurement need
– At times, certification requirements
– Determined by
• Management Perceptions
• Client requirements
• Above all, extent of problems we face
– Set quantifiable measurement goals based on past
experience and data
22
The Process
• Define Scope of Measurement
– Boundaries of measurement
– Types of projects
– Supporting processes
– Boundaries within the organization
– Maybe organization wide

24
The Process
• Prepare Project “Metric” Plan
– Resource Requirements - staffing, funding
– Roles and Responsibilities
– Training, Education and Organisational
Involvement
– Schedule - Time Frames
– Benefits
26
The Process
• Select Metrics
Popular method used is GQM

Goal: Improve the timeliness of change request processing from the project manager's viewpoint
Question: Is the performance of the process improving?
Metrics: Current average cycle time, Subjective rating of manager's satisfaction
28
The Process
• Select Metrics
Popular method used is GQM

Goal: Improve the timeliness of change request processing from the project manager's viewpoint
Question: What is the current change request processing speed?
Metrics: Average cycle time, Standard deviation, % Cases outside of the upper limit
29
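The three metrics this GQM question leads to can be sketched as below; the cycle-time data and the 10-day upper limit are hypothetical:

```python
# Average cycle time, standard deviation, and % of change requests
# outside an upper limit - the metrics named on the GQM slide above.
import statistics

cycle_times_days = [3, 5, 4, 12, 6, 2, 15, 5]  # hypothetical CR cycle times
upper_limit_days = 10                           # hypothetical upper limit

avg = statistics.mean(cycle_times_days)
sd = statistics.stdev(cycle_times_days)
pct_outside = 100 * sum(t > upper_limit_days for t in cycle_times_days) / len(cycle_times_days)

print(avg, round(sd, 2), pct_outside)
```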
The Collection Process
• Define collection procedures
– Source of data
– Collection responsibilities
– Frequency of collection
– Tools required
– Data validation and update
• Define forms to record measures
• Design metrics database - Should lend itself to
analysis and study
The Process
• Train and Educate
– Concepts to all, especially Senior Mgt.
– Motivation / Benefits & Uses

33
The Collection Process
• Collect Data
– Capture data during, not after
– Choose sample project
– Verify consistency and validity
– Defined forms tailored to suit project needs
– Database updated at predefined frequency
– Flag best achievements / practices
Collect Data
• Identify Sources of Information
• Capture Data during, not after
• Tools often useful
• Choose sample project, based on
– ease
– motivation of project personnel
– representative project

36
The Process
• Analyse and fine tune Metrics and
Processes
– Read the Data and ask - ‘So what?’
– Convert Data into Information
– Is the Data true?
– What is the Root Cause?
– Any process fine tuning required?
– Any metrics to be added / dropped?
38
The Process
– Analyze data at two levels:
• Project level to take decisions – enabling in-
flight course corrections
• Organizational level to understand process
capability – periodicity / data from minimum #
of completed projects
– Causal analysis – based on procedures
– Prepare and Present report
– Review and revise procedures to include
feedback
39
Data analysis
• Define analysis to be done and methods of analysis
– Correlations and Interactions between measures
– Define threshold values and warning limits
• Define goals; Ensure alignment
• Define reporting formats and how report should be used
• Devise mechanism for feedback
• Based on past data, set up process performance baselines
and process capability baselines
• Based on control limits,
– identify action points
– process improvement opportunities
Continuous Improvement
Metrics Cycle
• Start of Project: Determine list of Metrics to be collected
• Collection & Analysis: Throughout life cycle; At end of
every phase (SRS, Design..)
• Track project status, Corrective & Preventive action based
on Quantitative facts
• Project Close: Post Mortem analysis;
• Organizational Analysis
Process Management
[Diagram: Define Process → Execute Process → Measure Process → Control Process → Improve Process]
43
Product Metrics
– Measurement of Productivity
– Measurement of Size
• Function Points
• Lines of code
• Feature Points
– Measurement of Quality
• Defect profiles
• Reliability
Product Metrics - Quality
• No Single Metric for Software Quality
• Several interesting metrics can be used,
primarily tracking defects and analyzing
them
• IEEE standards :
– Defect Metrics
– Reliability - Mean time between failure
– Complexity
Defect Severity Metric
Metric :
• No. of serious defects per Function Point or
• No. of defects of Severity 1 or 2 per Kilo LOC
Data to be captured?

Severity 1: System inoperable
Severity 2: Major functions disabled / incorrect
Severity 3: Minor functions disabled / incorrect
Severity 4: Superficial functions
Defect Age

Indicates the no. of phases defect has lived


through since Introduction - Time from
Introduction to Detection -

Average Age =  Phase (Detected - Introduced)


Number of Defects
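The Average Age formula above can be sketched as follows, with phases numbered in SDLC order; the phase names and defect records are hypothetical:

```python
# Defect age: average number of phases a defect lives through, i.e.
# (phase detected - phase introduced) averaged over all defects.
PHASES = {"SRS": 1, "Design": 2, "Code": 3, "Test": 4, "Acceptance": 5}

# Each defect records (phase introduced, phase detected).
defects = [
    ("SRS", "Test"),        # lived 3 phases
    ("Design", "Code"),     # lived 1 phase
    ("Code", "Acceptance"), # lived 2 phases
]

average_age = sum(PHASES[det] - PHASES[intro] for intro, det in defects) / len(defects)
print(average_age)
```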
Defect Density
No of Defects in system/ Size
– Defects per KLOC (Kilo Lines of Code)
– Defects per Function Point
– Defects per module / program
Indicates the overall number of defects that may be present in the system
Points to error-prone modules
Defect Distribution
• Indicates % of defects during each sdlc phase
• Attention should be more for those phases
where defect distribution is high
• Can be done based on severity also
Defect Distribution
Doc Oth SRS SRS
Test Des ig n
Planning Design
Co d e
Tes t Planning
Do cument at io n
Ot hers
Defect Removal Cost
• Preparation cost
• Execution cost
• Repair cost

Useful to track; May also track Effort


Defect Rate
• Plot time Vs. No. of defects found
• Frequency/ Cumulative frequency
Reliability
Mean Time to Failure (MTTF)
• Basic parameter required by most
reliability models
• Records failure time - elapsed time
between failures
• Time units depend on application - CPU
time or wall clock time?
• Weigh by severity of failure
Reliability
Mean Time to Repair (MTTR)
• Correlate to Complexity?

54
Process Metrics
Key processes that we will focus on:
• Project Management
• Estimation
• Quality Control
• Quality Assurance
• Configuration Management
Project Management Metrics
• Schedule slippage
= Completion (Actual - Planned) / No. of planned elapsed days
• Effort overrun
= Effort (Actual - Planned) / Planned effort
Project Management Metrics

• Productivity = Size / Effort
• Cost overrun = Cost (Actual - Estimated) / Estimated cost
• Project Management effort overrun
= PM Effort (Actual - Planned) / Planned effort
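The ratios above, expressed directly in code; all inputs are hypothetical, and results are fractions (multiply by 100 for percentages):

```python
# Project-management metrics: schedule slippage, effort overrun,
# cost overrun, and productivity.
planned_days, actual_days = 100, 112
planned_effort, actual_effort = 500.0, 560.0    # person-days
estimated_cost, actual_cost = 200_000, 230_000
size_fp = 240                                    # function points

schedule_slippage = (actual_days - planned_days) / planned_days
effort_overrun = (actual_effort - planned_effort) / planned_effort
cost_overrun = (actual_cost - estimated_cost) / estimated_cost
productivity = size_fp / actual_effort           # FP per person-day

print(schedule_slippage, effort_overrun, cost_overrun, round(productivity, 3))
```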
Estimation process metrics
• Changes to size estimates
– either FPs or LOCs or others
– initial estimate / reestimate
– Re-estimate with same scope of work
– Re-estimate with altered scope
• Changes to effort schedule estimates
Metrics for Planning & Tracking
Effort Variance: Actual vs. planned effort (person-days)
Provides visibility into the contribution of staffing to project costs, schedule, & product quality
Metrics for Planning & Tracking …

Cost Variance: Actual vs. planned costs
Provides tracking of actual costs against estimated costs and predicts future costs
Metrics for Planning & Tracking …

Size Variance: Actual vs. planned size (KLOC)
Basis for estimation, scheduling, effort & work allocation
Basis for quantitative quality goals & targets

Schedule Variance: Actual vs. planned task completions; actual vs. planned durations
Provides information on project performance w.r.t. its schedule


Quality Control Process
• Test Coverage
= # of statements tested / Total # of statements
• Review Coverage (e.g. for SRS)
= # of requirements reviewed / Total # of requirements
Quality Control Process
• Test effectiveness ratio
= # of defects found by testing / Total no. of defects
• Review effectiveness ratio
= # of defects found by reviews / Total no. of defects
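The four quality-control ratios above can be sketched together; all counts are hypothetical:

```python
# Test/review coverage and effectiveness ratios for QC.
statements_tested, total_statements = 900, 1000
reqs_reviewed, total_reqs = 45, 50
defects_by_testing, defects_by_reviews, total_defects = 60, 30, 100

test_coverage = statements_tested / total_statements       # fraction tested
review_coverage = reqs_reviewed / total_reqs               # fraction reviewed
test_effectiveness = defects_by_testing / total_defects    # share found by tests
review_effectiveness = defects_by_reviews / total_defects  # share found by reviews

print(test_coverage, review_coverage, test_effectiveness, review_effectiveness)
```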
Quality control metrics
• No. of reviews - actual vs. planned
• No. of tests - actual vs. planned
• Effort spent in QC vs. planned

• No. of defects - actual vs. expected, if


appropriate
Quality Assurance Process
• No. of deviations / waivers to process
• No. of Non conformances during audits
• Deployment of tools
• Training imparted
• Deployment of standards
Effort Distribution
• Indicates Effort for each activity in
SDLC
• Helps in scheduling and work
distribution.
Effort Distribution
[Pie chart: effort distribution across SRS, Design, Code, Test Planning, Testing, PM, Training, QA, CM]
Efficiency Metrics
Review Efficiency:
• Indicates efficiency of work product review
in removing the defects during review
• Also indicates the number of defects
passed on to the subsequent phases
• E.g. SRS Review Efficiency = No. of requirements defects captured in SRS review / No. of SRS defects in system
Efficiency Metrics …
Test Efficiency:
• % of SRS, Design and Code type
defects detected during testing.
• Indicates efficiency of Testing process
in removing the defects
• E.g. Test Efficiency = Defects captured
during Testing/ Defects detected during
and after testing (AT, Support)
Defect Removal Efficiency
• Points to QC effectiveness before shipment
• DRE = No. of Defects found prior to delivery/Total
No. of Defects
• Indication of review and testing process.

• Also indicates the no. of defects being passed on

to the customer
Residual Defect Density
 No. of defects after Testing / Size
 Indicates the no. of defects passed on to the customer
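DRE and residual defect density computed side by side; the defect counts and size are hypothetical:

```python
# Defect Removal Efficiency (DRE) and residual defect density.
defects_pre_delivery = 95   # found by reviews and testing before shipment
defects_post_delivery = 5   # found by the customer / in support
size_kloc = 20.0

total_defects = defects_pre_delivery + defects_post_delivery
dre = defects_pre_delivery / total_defects            # fraction removed pre-delivery
residual_density = defects_post_delivery / size_kloc  # defects per KLOC after testing

print(dre, residual_density)
```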
Other potential metrics in the family

• Requirements coverage metric (also called Functionality metric)
= # of requirements met / Total number of requirements
• Defect density severity code wise
• No. of change requests processed per unit time
• No. of change requests per product
• Training effectiveness based on feedback
• Service uptime ratio (facilities mgmnt)
Process Metrics …
Requirement Stability
= No. of requirements added, changed, deleted / Total no. of requirements
 Stability of requirements, frequency of requirements changed by customer
 Increase in RSI increases schedule slippage
Customer Satisfaction
• Once in a while activity; Requires active
soliciting
• May be done by other than project personnel
• Impact may go beyond product
• What is the best method? Interviews? Mail
Surveys? Both?
• User associations / media may also do these
Customer Satisfaction
• Factors: Business Ethics, Quality of
deliverables, Project Management,
Creativity in design, Responsiveness, etc.
• Assign scale/ weight to evaluate customer
satisfaction index
• Identify means and mechanisms for
enhanced customer satisfaction and
repeat business opportunities
Maintenance Metrics
• Effort required for handling each Customer
Complaint
• Effort to clear a Help Desk Call
• Adherence to committed Service Levels
• Resource Utilization Index - Helps in planning
manpower
• Quality of Maintenance / Bad Fix Rate / Bug
Reopen Rate
Maintenance Metrics
Effort to Clear a Help Desk Call:
• Effort required for handling each Customer Complaint
• Helps in planning manpower
• Helps in keeping up customer commitment
• Average effort to clear a HDC = Total effort to clear HDCs / Total number of HDCs
Maintenance Metrics
Clearance Time:
• Time (duration) required for solving each customer complaint.
• Helps in planning manpower and in keeping customer commitment
• Average HDC Clearance Time = Total time to clear HDCs / Total number of HDCs
Quality of Maintenance / Bad Fix Rate / Bug Reopen Rate
• The number of defects / bugs reported by customer on the work
requests / change requests / bug reports completed by maintenance
team
Resource Utilization Index
• Total hours reported by Team against number of standard working
hours available in a week / month
Maintenance Metrics …
Adherence to Committed Service Levels
• Percentage of work completed against
agreed service levels with the customer
• Monitored Periodically
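The average-effort and clearance-time formulas above can be sketched as below; the help-desk call log is hypothetical:

```python
# Average effort and clearance time per help-desk call (HDC).
hdc_log = [
    {"effort_hours": 2.0, "clearance_days": 1},
    {"effort_hours": 4.0, "clearance_days": 3},
    {"effort_hours": 3.0, "clearance_days": 2},
]

n = len(hdc_log)
avg_effort = sum(c["effort_hours"] for c in hdc_log) / n       # hours per HDC
avg_clearance = sum(c["clearance_days"] for c in hdc_log) / n  # days per HDC

print(avg_effort, avg_clearance)
```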
REMEMBER!

WHAT YOU CANNOT MEASURE, YOU CANNOT MANAGE
Metrics in Project
Management
• Planning: Basis for cost estimating, training &
resource planning, scheduling & budgeting
• Controlling: Basis for tracking, controlling and
taking action on variances
• Improving: Tool to identify improvement areas
& measure effects of process improvement
efforts
Statistical Process Control
• Use of statistical tools and techniques
(e.g. control charts ) to analyze a process
or its outputs to control, manage and
improve the quality of the output or the
capability of the process
SPC for Software Premises
• The software process is performed by people, not
machines
• The software process is (or can be) repeatable, not
repetitive
• The act of measuring and analysis will change behavior
Business Value
Goals, objectives, strategies and plans in all organizations
are based on two fundamental needs.
– Providing competitive products or services in terms of
functionality, time-to-market, quality and cost
– Meeting commitments to customers with respect to
products and services

Success in meeting commitments means that commitments must be achievable. This implies the need to predict outcomes.
Listening to voices
Voice of the Process = the natural bounds of process performance

Voice of the Customer = the goals established for product and process performance

Capable process = stable process + product conformance

Process capability DOES NOT EQUATE TO capable process
Stability
Concern: Is the process that we are managing behaving predictably?

Business value: the foundation for estimating (predicting) and making commitments
Capability
Concerns: Is the process capable of delivering products that meet requirements? Does the performance of the process meet the business needs of the organization?

Business value: the foundation for making commitments
Techniques for Metrics Analysis

Control Charts –
• To discover variability in a process
• Determine whether a process is in statistical control
• To differentiate common-cause (random) variation from special-cause variation
• Determine special causes of variance
• Control charts plotted for all metrics (Effort, Schedule, Defects, ...)
Control Chart Basics
Why Control Charts ?
• Control charts let you know what your processes can do, so that you can set achievable goals
• Control charts provide the evidence of stability that justifies predicting process performance
• Control charts separate signal from noise so that you can recognize a process change when it occurs
• Control charts identify unusual events. They point you to fixable problems and potential process improvements
Control Chart “Causes”
CONTROL CHART FOR DEVELOPMENT / TEST PROJECT

Phase:                SRS    DESIGN CODING PLANNI. TESTING ACCEPT. SUPPORT MGT.   QA     TRG.   CM
Effort Variance (%):  16.76  23.70  17.45  28.31   17.60   20.20   18.44   23.22  19.40  10.40  17.40
Baseline: 20.13%   LCL: 15.92%   UCL: 24.33%   (same for all phases)

[Control chart: effort variance per phase plotted against baseline 20.13%, LCL 15.92% and UCL 24.33%; PLANNING (28.31%) falls above the UCL and TRG. (10.40%) below the LCL]
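Flagging out-of-control phases against the chart's limits can be sketched as below; the baseline, LCL and UCL are taken as given from the slide (their derivation is not shown here):

```python
# Flag phases whose effort variance falls outside the control limits,
# using the data from the control-chart table above.
variance_pct = {
    "SRS": 16.76, "DESIGN": 23.70, "CODING": 17.45, "PLANNING": 28.31,
    "TESTING": 17.60, "ACCEPT.": 20.20, "SUPPORT": 18.44, "MGT.": 23.22,
    "QA": 19.40, "TRG.": 10.40, "CM": 17.40,
}
LCL, UCL = 15.92, 24.33  # limits as stated on the slide

out_of_control = [p for p, v in variance_pct.items() if not (LCL <= v <= UCL)]
print(out_of_control)  # the phases signalling special causes
```

These are exactly the points a control chart would flag for causal analysis.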
Measurement and Process
Management

91
Source: SEI
Summary
Metrics Program - Organizational
Define metrics to be collected (project & support group) → Establish data collection mechanism → Store data for future use in metrics database → Analyze metrics → Arrive at organizational process capability baseline
Metrics Program - Projects
Organizational process capability baseline + customer's quality goals → Projects set quantitative goals on process & products → Pass project's data for future use in metrics database

Projects do in-process metrics analysis to take decisions based on actual performance v/s the set goals
