Mihaela Dinsoreanu, PhD, Computer Science Department
SOFTWARE QUALITY METRICS OVERVIEW
Product metrics: size, complexity, design features, performance
Process metrics: effectiveness of defect removal during development, pattern of testing defect arrival, response time of the fix process
Project metrics: number of software developers, the staffing pattern over the life cycle of the software, cost, schedule, productivity
SOFTWARE QUALITY METRICS
A subset of software metrics that focus on the quality aspects of the product, process, and project. Divided into:
- end-product quality metrics
- in-process quality metrics
END-PRODUCT QUALITY METRICS
- Mean time to failure (MTTF): used for safety-critical systems (e.g., the US air traffic control system cannot be unavailable for more than 3 seconds per year) and for many commercial software systems
- Customer problems
- Customer satisfaction
According to the IEEE/American National Standards Institute (ANSI) Standard 982.2:
- An error is a human mistake that results in incorrect software.
- A fault is an accidental condition that causes a unit of the system to fail to function as required.
- A defect is an anomaly in a product.
- A failure occurs when a functional unit of a software-related system can no longer perform its required function or cannot perform it within specified limits.
DEFECTS
Errors (development process) => faults and defects in the system
Faults/defects => failures (run-time)
1 fault => 0..* failures
"Defect size" = the probability of failure associated with a latent defect.
THE DEFECT DENSITY METRIC
Numerator? Denominator? Time?
Conceptual definition: Defect rate = number of defects / the opportunities for error (OFE) in a given time frame.
PRACTICAL DEFINITION
Defect rate = number of unique causes of observed failures / size of the software (KLOC or FP)
LOC counting variants:
- Count only executable lines.
- Count executable lines plus data definitions.
- Count executable lines, data definitions, and comments.
- Count executable lines, data definitions, comments, and job control language.
- Count lines as physical lines on an input screen.
- Count lines as terminated by logical delimiters.
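A minimal Python sketch (not from the slides) of this practical definition; the function names are illustrative assumptions.

def defect_rate_per_kloc(unique_failure_causes: int, loc: int) -> float:
    # Defects per KLOC = unique causes of observed failures / (LOC / 1000)
    return unique_failure_causes / (loc / 1000.0)

def defect_rate_per_fp(unique_failure_causes: int, function_points: float) -> float:
    # Defects per function point
    return unique_failure_causes / function_points

# Example: 120 unique failure causes in a 60 KLOC product -> 2.0 defects/KLOC
print(defect_rate_per_kloc(120, 60_000))   # 2.0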
LOC
Straight LOC count
- Comparisons? Normalized to Assembler-equivalent LOC
What if?
- First release (50 KLOC, latent defect rate = 2.0 defects/KLOC during the next four years)
- Next releases? Old code vs. new/changed code
DEFECT RATE FOR NEW/CHANGED CODE
LOC count:
- For the entire software product
- For the new and changed code of the release
Defect tracking: defects must be tracked to the release origin, i.e., the portion of the code that contains the defects and the release at which that portion was added, changed, or enhanced.
CHANGE FLAGGING
When a new function is added, the new and changed lines of code are flagged with a specific identification (ID) number. The ID is linked to the requirements number. If the change-flagging IDs and requirements IDs are further linked to the release number of the product, LOC counting tools can use the linkages to count the new and changed code in new releases.
EXAMPLE (IBM ROCHESTER)
LOC = instruction statements (logical LOC)
- Shipped Source Instructions (SSI): total product
- Changed Source Instructions (CSI): new and changed code of the new release
POST-RELEASE DEFECT RATE METRICS
- Total defects per KSSI (a measure of code quality of the total product): process metric
- Field defects per KSSI (a measure of defect rate in the field): customer's perspective
- Release-origin defects (field and internal) per KCSI (a measure of development quality): process metric
- Release-origin field defects per KCSI (a measure of development quality per defects found by customers): customer's perspective
CUSTOMER'S PERSPECTIVE
Initial Release of Product Y
- KCSI = KSSI = 50 KLOC
- Defects/KCSI = 2.0
- Total number of defects = 2.0 x 50 = 100
Second Release
- KCSI = 20
- KSSI = 50 + 20 (new and changed lines of code) - 4 (assuming 20% are changed lines of code) = 66
- Defects/KCSI = 1.8 (assuming 10% improvement over the first release)
- Total number of additional defects = 1.8 x 20 = 36
Third Release
- KCSI = 30
- KSSI = 66 + 30 (new and changed lines of code) - 6 (assuming the same 20% of changed lines of code) = 90
- Targeted number of additional defects (no more than previous release) = 36
- Defect rate target for the new and changed lines of code: 36/30 = 1.2 defects/KCSI or lower
CONCLUSION
Release 1 -> Release 2: 64% reduction [(100 - 36)/100] in the number of defects perceived by the client
Release 2 -> Release 3: the defect rate has to be better (1.2 vs. 1.8) to preserve the number of defects
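A short Python sketch reproducing the release arithmetic above, assuming the example's 20%-changed-lines adjustment; all names are illustrative.

def next_kssi(prev_kssi: float, kcsi: float, changed_fraction: float = 0.20) -> float:
    # New KSSI = previous KSSI + new/changed code, minus the changed lines
    # already counted in the previous base (changed_fraction * KCSI)
    return prev_kssi + kcsi - changed_fraction * kcsi

# Release 1: KCSI = KSSI = 50, 2.0 defects/KCSI
defects_r1 = 2.0 * 50              # 100

# Release 2: KCSI = 20, 10% improvement -> 1.8 defects/KCSI
kssi_r2 = next_kssi(50, 20)        # 66
defects_r2 = 1.8 * 20              # 36

# Release 3: KCSI = 30, target no more than 36 additional defects
kssi_r3 = next_kssi(kssi_r2, 30)   # 90
target_rate_r3 = 36 / 30           # 1.2 defects/KCSI or lower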
FP
Definition: "A function can be defined as a collection of executable statements that performs a certain task, together with declarations of the formal parameters and local variables manipulated by those statements." (Conte et al., 1986)
FP COMPONENTS - AVERAGE WEIGHTING FACTORS
- Number of external inputs (e.g., transaction types) x 4
- Number of external outputs (e.g., report types) x 5
- Number of logical internal files (files as the user might conceive them, not physical files) x 10
- Number of external interface files (files accessed by the application but not maintained by it) x 7
- Number of external inquiries (types of online inquiries supported) x 4
FP - LOW AND HIGH WEIGHTING FACTORS
- External input: low complexity 3, high complexity 6
- External output: low complexity 4, high complexity 7
- Logical internal file: low complexity 7, high complexity 15
- External interface file: low complexity 5, high complexity 10
- External inquiry: low complexity 3, high complexity 6
COMPLEXITY
Classified based on a set of standards. Ex.: external output component:
- if ((number of data element types <= 5) && (number of file types referenced <= 3)) then complexity := "low"
- if ((number of data element types >= 20) && (number of file types referenced >= 2)) then complexity := "high"
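A runnable Python version of this pseudocode; the thresholds are the ones given above, while the "average" fallback and the function name are assumptions.

def external_output_complexity(data_element_types: int, file_types_referenced: int) -> str:
    if data_element_types <= 5 and file_types_referenced <= 3:
        return "low"
    if data_element_types >= 20 and file_types_referenced >= 2:
        return "high"
    return "average"   # assumed fallback for everything in between

print(external_output_complexity(4, 2))    # low
print(external_output_complexity(25, 3))   # high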
FP STEP 1: FUNCTION COUNTS
FC = sum over i = 1..5 and j = 1..3 of w_ij * x_ij, where
- w_ij are the weighting factors of the five components by complexity level (low, average, high)
- x_ij are the numbers of each component in the application
FP STEP 2: IMPACT OF GENERAL SYSTEM CHARACTERISTICS
Assign a score in [0..5] (c_i) to each of the 14 characteristics:
1. Data communications
2. Distributed functions
3. Performance
4. Heavily used configuration
5. Transaction rate
6. Online data entry
7. End-user efficiency
8. Online update
9. Complex processing
10. Reusability
11. Installation ease
12. Operational ease
13. Multiple sites
14. Facilitation of change
FP STEP 3: VALUE ADJUSTMENT FACTOR
VAF = 0.65 + 0.01 * (sum of the 14 c_i scores), so VAF ranges from 0.65 to 1.35
FP = FC x VAF
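Putting the three steps together, a hedged Python sketch: the weights are the low/average/high factors listed earlier, while the component counts and GSC scores are hypothetical.

# w[i][j]: weighting factors per component for (low, average, high) complexity
WEIGHTS = {
    "external_input":          (3, 4, 6),
    "external_output":         (4, 5, 7),
    "logical_internal_file":   (7, 10, 15),
    "external_interface_file": (5, 7, 10),
    "external_inquiry":        (3, 4, 6),
}

def function_count(counts: dict) -> float:
    # Step 1: FC = sum over components i and complexity levels j of w_ij * x_ij
    return sum(w * x for comp, levels in counts.items()
                     for w, x in zip(WEIGHTS[comp], levels))

def value_adjustment_factor(gsc_scores: list) -> float:
    # Step 3: VAF = 0.65 + 0.01 * sum of the 14 GSC scores (each in 0..5)
    assert len(gsc_scores) == 14 and all(0 <= c <= 5 for c in gsc_scores)
    return 0.65 + 0.01 * sum(gsc_scores)

# Hypothetical counts per component: (low, average, high complexity instances)
counts = {
    "external_input":          (5, 10, 2),
    "external_output":         (3, 6, 1),
    "logical_internal_file":   (2, 4, 1),
    "external_interface_file": (1, 2, 0),
    "external_inquiry":        (4, 5, 0),
}
fc = function_count(counts)
fp = fc * value_adjustment_factor([3] * 14)   # FP = FC x VAF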
EXAMPLE
Estimated defect rates per function point [Jones, Software Assessments, Benchmarks, and Best Practices, 2000]:
- SEI CMM Level 1: 0.75
- SEI CMM Level 2: 0.44
- SEI CMM Level 3: 0.27
- SEI CMM Level 4: 0.14
- SEI CMM Level 5: 0.05
CUSTOMER PROBLEMS METRIC
- Valid defects
- Usability problems
- Unclear documentation or information
- Duplicates of valid defects (defects that were reported by other customers and fixes were available, but the current customers did not know of them)
- User errors
PROBLEMS PER USER MONTH (PUM)
PUM = total problems that customers reported (true defects + non-defect-oriented problems) for a time period / total license-months of the software during the period
Number of license-months = number of installed licenses of the software x number of months in the calculation period
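A small Python sketch of the PUM formula; the figures are hypothetical.

def pum(total_problems: int, install_licenses: int, months: int) -> float:
    # Problems per user-month = reported problems / (licenses * months)
    license_months = install_licenses * months
    return total_problems / license_months

# e.g., 480 problems reported against 10,000 licenses over 12 months
print(pum(480, 10_000, 12))   # 0.004 problems per user-month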
SUMMARY
Defect rate
- Numerator: valid and unique product defects
- Denominator: size of product (KLOC or function points)
- Measurement perspective: producer (the software development organization)
- Scope: intrinsic product quality
PUM
- Numerator: all customer problems (defects and non-defects, first time and repeated)
- Denominator: customer usage of the product (user-months)
- Measurement perspective: customer
- Scope: intrinsic product quality plus other factors
CUSTOMER SATISFACTION METRICS
Customer survey, on a five-point scale:
- Very satisfied
- Satisfied
- Neutral
- Dissatisfied
- Very dissatisfied
Several metrics can be derived:
- Percent of completely satisfied customers
- Percent of satisfied customers (satisfied and completely satisfied)
- Percent of dissatisfied customers (dissatisfied and completely dissatisfied)
- Percent of nonsatisfied customers (neutral, dissatisfied, and completely dissatisfied)
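A sketch computing these percentages from hypothetical survey counts (the slide's "very satisfied" is the "completely satisfied" category).

from collections import Counter

responses = Counter(very_satisfied=120, satisfied=300, neutral=60,
                    dissatisfied=15, very_dissatisfied=5)
total = sum(responses.values())

pct_completely_satisfied = 100 * responses["very_satisfied"] / total
pct_satisfied = 100 * (responses["very_satisfied"] + responses["satisfied"]) / total
pct_dissatisfied = 100 * (responses["dissatisfied"] + responses["very_dissatisfied"]) / total
pct_nonsatisfied = 100 * (responses["neutral"] + responses["dissatisfied"]
                          + responses["very_dissatisfied"]) / total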
IN-PROCESS QUALITY METRICS
- Defect Density During Machine Testing
- Defect Arrival Pattern During Machine Testing
- Phase-Based Defect Removal Pattern
- Defect Removal Effectiveness
DEFECT DENSITY DURING MACHINE TESTING
"The more defects found during testing, the more defects will be found later" => release-to-release comparisons
Current defect rate <= previous release defect rate. Did the testing for the current release deteriorate?
- No => the quality perspective is positive.
- Yes => extra testing needed (e.g., add test cases to increase coverage, blitz test, customer testing, stress testing, etc.).
Current defect rate > previous release defect rate. Did we plan for and actually improve testing effectiveness?
- No => the quality perspective is negative.
- Yes => the quality perspective is the same or positive.
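The interpretation logic above can be expressed as a small decision function; a sketch, where the boolean parameters encoding the two questions are an illustrative assumption.

def quality_perspective(current_rate: float, previous_rate: float,
                        testing_deteriorated: bool,
                        testing_effectiveness_improved: bool) -> str:
    if current_rate <= previous_rate:
        if not testing_deteriorated:
            return "positive"
        return "extra testing needed (e.g., more coverage, stress/customer testing)"
    if testing_effectiveness_improved:
        return "same or positive"
    return "negative"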
DEFECT ARRIVAL PATTERN DURING TESTING
- The defect arrivals (defects reported) during the testing phase by time interval (e.g., week).
- The pattern of valid defect arrivals, i.e., when problem determination is done on the reported problems. This is the true defect pattern.
- The pattern of defect backlog over time. This metric is a workload statement as well as a quality statement.
PHASE-BASED DEFECT REMOVAL PATTERN
Phases:
- high-level design review (I0)
- low-level design review (I1)
- code inspection (I2)
- unit test (UT)
- component test (CT)
- system test (ST)
DEFECT REMOVAL EFFECTIVENESS
DRE = (defects removed during a development phase / defects latent in the product at that phase) x 100%
The latent defects for a phase are not known directly; they are usually estimated as the defects removed during the phase plus the defects found later.
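A sketch under the estimation assumption above (latent = removed in phase + found later); the numbers are hypothetical.

def removal_effectiveness(removed_in_phase: int, found_later: int) -> float:
    latent = removed_in_phase + found_later
    return 100.0 * removed_in_phase / latent

# e.g., 70 defects removed at code inspection (I2), 30 escaped to later phases
print(removal_effectiveness(70, 30))   # 70.0 (percent)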
METRICS FOR SOFTWARE MAINTENANCE
- Fix backlog and backlog management index
- Fix response time and fix responsiveness
- Percent delinquent fixes
- Fix quality
FIX BACKLOG (FB) AND BACKLOG MANAGEMENT INDEX (BMI)
FB = a workload statement for software maintenance: the count of reported problems that remain open at the end of each month or each week.
BMI = (number of problems closed during the month / number of problem arrivals during the month) x 100%; a BMI above 100 means the backlog is being reduced.
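A one-line BMI helper, assuming the monthly counts are available; the figures are hypothetical.

def bmi(closed_in_month: int, arrived_in_month: int) -> float:
    return 100.0 * closed_in_month / arrived_in_month

print(bmi(55, 50))   # 110.0 -> backlog reduced this month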
FIX RESPONSE TIME AND FIX RESPONSIVENESS
Fix response time = mean time of all problems from open to closed
Fix responsiveness = f(customer expectations, the agreed-to fix time, the ability to meet one's commitment to the customer)
PERCENT DELINQUENT FIXES
A fix is delinquent if its turnaround time exceeds the response-time criterion for its severity level.
Percent delinquent fixes = (number of fixes that exceeded the response time criteria by severity level / total number of fixes delivered in a specified time) x 100%
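A sketch of the delinquency calculation; the per-severity response-time criteria are hypothetical.

CRITERIA_DAYS = {1: 7, 2: 14, 3: 30, 4: 60}   # assumed days allowed per severity

def percent_delinquent(fixes: list) -> float:
    # fixes: list of (severity, days_to_deliver) for fixes delivered in the period
    delinquent = sum(1 for sev, days in fixes if days > CRITERIA_DAYS[sev])
    return 100.0 * delinquent / len(fixes)

print(percent_delinquent([(1, 5), (2, 20), (3, 10), (1, 9)]))   # 50.0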
EXAMPLES OF METRIC PROGRAMS: MOTOROLA
Motorola Quality Policy for Software Development (QPSD) goals:
- Goal 1: Improve project planning.
- Goal 2: Increase defect containment.
- Goal 3: Increase software reliability.
- Goal 4: Decrease software defect density.
- Goal 5: Improve customer service.
- Goal 6: Reduce the cost of nonconformance.
- Goal 7: Increase software productivity.
MOTOROLA QUALITY POLICY FOR SOFTWARE DEVELOPMENT (QPSD)
Measurement areas:
- Delivered defects and delivered defects per size
- Total effectiveness throughout the process
- Adherence to schedule
- Accuracy of estimates
- Number of open customer problems
- Time that problems remain open
- Cost of nonconformance
- Software reliability
GOAL 5: IMPROVE CUSTOMER SERVICE
Question 5.1: What is the number of new problems opened during the month?
Metric 5.1: New Open Problems (NOP) = total new postrelease problems opened during the month
Question 5.2: What is the total number of open problems at the end of the month?
Metric 5.2: Total Open Problems (TOP) = total postrelease problems that remained open at the end of the month
Question 5.3: What is the mean age of open problems at the end of the month?
Metric 5.3: Mean Age of Open Problems (AOP) = (total time that postrelease problems remaining open at the end of the month have been open) / (number of postrelease problems remaining open at the end of the month)
Question 5.4: What is the mean age of the problems that were closed during the month?
Metric 5.4: Mean Age of Closed Problems (ACP) = (total time that postrelease problems closed within the month were open) / (number of postrelease problems closed within the month)
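A sketch computing Metrics 5.1 through 5.4 from a list of problem records; the record layout (open/close days relative to the month start, None for still-open) is an assumption.

def goal5_metrics(problems, month_end_day=30):
    # problems: list of (opened_day, closed_day); closed_day is None if still open
    nop = sum(1 for o, c in problems if 1 <= o <= month_end_day)
    open_probs = [(o, c) for o, c in problems if c is None or c > month_end_day]
    closed_probs = [(o, c) for o, c in problems if c is not None and c <= month_end_day]
    top = len(open_probs)
    aop = sum(month_end_day - o for o, _ in open_probs) / top if top else 0.0
    acp = sum(c - o for o, c in closed_probs) / len(closed_probs) if closed_probs else 0.0
    return nop, top, aop, acp

probs = [(-10, 5), (3, None), (12, 25), (20, None)]
print(goal5_metrics(probs))   # NOP=3, TOP=2, AOP=18.5, ACP=14.0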
COLLECTING SOFTWARE ENGINEERING DATA
- Must be based on well-defined metrics and models
- Data classification schemes to be used
- Level of precision must be specified
- The collection form should be pretested
- The information extracted from the data should be focused, accurate, and useful
DATA COLLECTION METHODOLOGY
Basili and Weiss (1984):
1. Establish the goal of the data collection.
2. Develop a list of questions of interest.
3. Establish data categories.
4. Design and test data collection forms.
5. Collect and validate data.
6. Analyze data.
EXAMPLE
Inspection defect: a problem found during the inspection process which, if not fixed, would cause one or more of the following to occur:
- A defect condition in a later inspection phase
- A defect condition during testing
- A field defect
- Nonconformance to requirements and specifications
- Nonconformance to established standards such as performance, national language translation, and usability
INSPECTION SUMMARY FORM
INTERFACE DEFECTS
An interface defect is a defect in the way two separate pieces of logic communicate. These are errors in communication between:
- Components
- Products
- Modules and subroutines of a component
- User interface (e.g., messages, panels)
EXAMPLES OF INTERFACE DEFECTS PER DEVELOPMENT PHASE
High-Level Design (I0)
- Use of wrong parameter
- Inconsistent use of function keys on user interface (e.g., screen)
- Incorrect message used
- Presentation of information on screen not usable
Low-Level Design (I1)
- Missing required parameters (e.g., missing parameter on module)
- Wrong parameters (e.g., specified incorrect parameter on module)
- Intermodule interfaces: input not there, input in wrong order
- Intramodule interfaces: passing values/data to subroutines
- Incorrect use of common data structures
- Misusing data passed to code
Code (I2)
- Passing wrong values for parameters on macros, application program interfaces (APIs), modules
- Setting up a common control block/area used by another piece of code incorrectly
- Not issuing correct exception to caller of code
LOGIC DEFECT
A logic defect is one that would cause incorrect results in the function to be performed by the logic. High-level categories of this type of defect:
- Function: capability not implemented or implemented incorrectly
- Assignment: initialization
- Checking: validate data/values before use
- Timing: management of shared/real-time resources
- Data structures: static and dynamic definition of data
EXAMPLES OF LOGIC DEFECTS PER DEVELOPMENT PHASE
High-Level Design (I0)
- Invalid or incorrect screen flow
- High-level flow through component missing or incorrect in the review package
- Function missing from macros you are implementing
- Using a wrong macro to do a function that will not work (e.g., using XXXMSG to receive a message from a program message queue, instead of YYYMSG)
- Missing requirements
- Missing parameter/field on command/in database structure/on screen you are implementing
- Wrong value on keyword (e.g., macro, command)
- Wrong keyword (e.g., macro, command)
Low-Level Design (I1)
- Logic does not implement I0 design
- Missing or excessive function
- Values in common structure not set
- Propagation of authority and adoption of authority (lack of or too much)
- Lack of code page conversion
- Incorrect initialization
- Not handling abnormal termination (conditions, cleanup, exit routines)
- Lack of normal termination cleanup
EXAMPLES OF LOGIC DEFECTS PER DEVELOPMENT PHASE
Code (I2)
- Code does not implement I1 design
- Lack of initialization
- Variables initialized incorrectly
- Missing exception monitors
- Exception monitors in wrong order
- Exception monitors not active
- Exception monitors active at the wrong time
- Exception monitors set up wrong
- Truncating double-byte character set data incorrectly (e.g., truncating before the shift-in character)
- Incorrect code page conversion
- Lack of code page conversion
- Not handling exceptions/return codes correctly
WRAP-UP
Software quality metrics focus on the quality aspects of the product, process, and project. They can be grouped into three categories in accordance with the software life cycle:
- end-product quality metrics
- in-process quality metrics
- maintenance quality metrics
Example of a metrics program at Motorola.