
Chapter 26

Quality Management
- Quality concepts
- Software quality assurance
- Software reviews
- Statistical software quality assurance
- Software reliability, availability, and safety
- SQA plan

(Source: Pressman, R. Software Engineering: A Practitioner’s Approach. McGraw-Hill, 2005)


Quality Concepts
What is Quality Management
• Also called software quality assurance (SQA)
• Serves as an umbrella activity that is applied throughout the software
process
• Involves doing the software development correctly versus doing it over
again
• Reduces the amount of rework, which results in lower costs and
improved time to market
• Encompasses
– A software quality assurance process
– Specific quality assurance and quality control tasks (including formal
technical reviews and a multi-tiered testing strategy)
– Effective software engineering practices (methods and tools)
– Control of all software work products and the changes made to them
– A procedure to ensure compliance with software development standards
– Measurement and reporting mechanisms

3
Quality Defined

• Defined as a characteristic or attribute of something


• Refers to measurable characteristics that we can compare to known
standards
• In software it involves such measures as cyclomatic complexity,
cohesion, coupling, function points, and source lines of code
• Includes variation control
– A software development organization should strive to minimize the
variation between the predicted and the actual values for cost, schedule,
and resources (a simple sketch of this comparison follows this list)
– It should also make sure its testing program covers a known percentage
of the software from one release to another
– One goal is to ensure that the variance in the number of bugs is also
minimized from one release to another
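
A minimal Python sketch of the predicted-versus-actual comparison above, using invented project figures (the field names and numbers are illustrative only):

    # Hypothetical predicted vs. actual project figures (illustrative only).
    predicted = {"cost_k_usd": 250.0, "schedule_weeks": 26.0, "staff": 6.0}
    actual = {"cost_k_usd": 290.0, "schedule_weeks": 31.0, "staff": 7.0}

    # Report the relative deviation for each tracked value; variation control
    # aims to keep these deviations small from one release to the next.
    for key in predicted:
        deviation_pct = (actual[key] - predicted[key]) / predicted[key] * 100
        print(f"{key}: predicted {predicted[key]:g}, actual {actual[key]:g}, "
              f"deviation {deviation_pct:+.1f}%")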

4
Quality Defined (continued)

• Two kinds of quality are sought out


– Quality of design
• The characteristic that designers specify for an item
• This encompasses requirements, specifications, and the design of the system
– Quality of conformance (i.e., implementation)
• The degree to which the design specifications are followed during
manufacturing
• This focuses on how well the implementation follows the design and how well
the resulting system meets its requirements
• Quality also can be looked at in terms of user satisfaction

User satisfaction = compliant product + good quality + delivery within budget and schedule

5
Quality Control

• Involves a series of inspections, reviews, and tests used throughout the software process
• Ensures that each work product meets the requirements placed on it
• Includes a feedback loop to the process that created the work product
– This is essential in minimizing the errors produced
• Combines measurement and feedback in order to adjust the process
when product specifications are not met
• Requires all work products to have defined, measurable specifications to
which practitioners may compare the output of each process

6
Quality Assurance Functions
• Consists of a set of auditing and reporting functions that assess the
effectiveness and completeness of quality control activities
• Provides management personnel with data that provides insight into
the quality of the products
• Alerts management personnel to quality problems so that they can
apply the necessary resources to resolve quality issues

7
The Cost of Quality
• Includes all costs incurred in the pursuit of quality or in performing
quality-related activities
• Is studied to
– Provide a baseline for the current cost of quality
– Identify opportunities for reducing the cost of quality
– Provide a normalized basis of comparison (which is usually dollars)
• Involves various kinds of quality costs (See next slide)
• Increases dramatically as the activities progress from
– Prevention → Detection → Internal failure → External failure

"It takes less time to do a thing right than to explain why you did it wrong." Longfellow

8
Kinds of Quality Costs

• Prevention costs
– Quality planning, formal technical reviews, test equipment, training
• Appraisal costs
– Inspections, equipment calibration and maintenance, testing
• Failure costs – subdivided into internal failure costs and external
failure costs
– Internal failure costs
• Incurred when an error is detected in a product prior to shipment
• Include rework, repair, and failure mode analysis
– External failure costs
• Involve defects found after the product has been shipped
• Include complaint resolution, product return and replacement, help line
support, and warranty work
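
A rough Python sketch of how these cost categories might be rolled up; all amounts are invented for illustration:

    # Hypothetical cost-of-quality figures, grouped by the categories above.
    costs = {
        "prevention": {"quality planning": 8_000, "reviews": 12_000, "training": 5_000},
        "appraisal": {"inspections": 9_000, "testing": 20_000, "calibration": 2_000},
        "internal failure": {"rework": 15_000, "repair": 7_000, "failure mode analysis": 3_000},
        "external failure": {"complaint resolution": 18_000, "returns": 9_000, "warranty work": 25_000},
    }

    total = sum(sum(items.values()) for items in costs.values())
    for category, items in costs.items():
        subtotal = sum(items.values())
        print(f"{category:17s} ${subtotal:>7,} ({subtotal / total:.0%} of quality cost)")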

9
Software Quality Assurance
Software Quality Defined

Definition: "Conformance to explicitly stated functional and


performance requirements, explicitly documented development
standards, and implicit characteristics that are expected of all
professionally developed software"

(More on next slide) 11


Software Quality Defined
(continued)
• This definition emphasizes three points
– Software requirements are the foundation from which quality is measured;
lack of conformance to requirements is lack of quality
– Specified standards define a set of development criteria that guide the
manner in which software is engineered; if the criteria are not followed,
lack of quality will almost surely result
– A set of implicit requirements often goes unmentioned; if software fails to
meet implicit requirements, software quality is suspect
• Software quality is no longer the sole responsibility of the programmer
– It extends to software engineers, project managers, customers,
salespeople, and the SQA group
– Software engineers apply solid technical methods and measures, conduct
formal technical reviews, and perform well-planned software testing

12
The SQA Group
• Serves as the customer's in-house representative
• Assists the software team in achieving a high-quality product
• Views the software from the customer's point of view
– Does the software adequately meet quality factors?
– Has software development been conducted according to pre-established
standards?
– Have technical disciplines properly performed their roles as part of the
SQA activity?
• Performs a set of activities that address quality assurance planning,
oversight, record keeping, analysis, and reporting (See next slide)

13
SQA Activities
• Prepares an SQA plan for a project
• Participates in the development of the project's software process description
• Reviews software engineering activities to verify compliance with the defined
software process
• Audits designated software work products to verify compliance with those
defined as part of the software process
• Ensures that deviations in software work and work products are documented
and handled according to a documented procedure
• Records any noncompliance and reports to senior management
• Coordinates the control and management of change
• Helps to collect and analyze software metrics

14
Software Reviews
Purpose of Reviews
• Serve as a filter for the software process
• Are applied at various points during the software process
• Uncover errors that can then be removed
• Purify the software analysis, design, coding, and testing activities
• Catch large classes of errors that escape the originator more readily than
they escape other practitioners
• Include the formal technical review (also called a walkthrough or
inspection)
– Acts as the most effective SQA filter
– Conducted by software engineers for software engineers
– Effectively uncovers errors and improves software quality
– Has been shown to be up to 75% effective in uncovering design flaws
(which constitute 50-65% of all errors in software)
• Require the software engineers to expend time and effort, and the
organization to cover the costs
16
Formal Technical Review (FTR)
• Objectives
– To uncover errors in function, logic, or implementation for any
representation of the software
– To verify that the software under review meets its requirements
– To ensure that the software has been represented according to predefined
standards
– To achieve software that is developed in a uniform manner
– To make projects more manageable
• Serves as a training ground for junior software engineers to observe
different approaches to software analysis, design, and construction
• Promotes backup and continuity because a number of people become
familiar with other parts of the software
• May sometimes be a sample-driven review
– Project managers must quantify those work products that are the primary
targets for formal technical reviews
– The sample of products that are reviewed must be representative of the
products as a whole
17
The FTR Meeting
• Has the following constraints
– Three to five people should be involved
– Advance preparation (i.e., reading) should occur for each participant but should
require no more than two hours apiece and involve only a small subset of
components
– The duration of the meeting should be less than two hours
• Focuses on a specific work product (a software requirements specification, a
detailed design, a source code listing)
• Activities before the meeting
– The producer informs the project manager that a work product is complete and
ready for review
– The project manager contacts a review leader, who evaluates the product for
readiness, generates copies of product materials, and distributes them to the
reviewers for advance preparation
– Each reviewer spends one to two hours reviewing the product and making notes
before the actual review meeting
– The review leader establishes an agenda for the review meeting and schedules the
time and location
(More on next slide) 18
The FTR Meeting (continued)
• Activities during the meeting
– The meeting is attended by the review leader, all reviewers, and the producer
– One of the reviewers also serves as the recorder for all issues and decisions
concerning the product
– After a brief introduction by the review leader, the producer proceeds to "walk
through" the work product while reviewers ask questions and raise issues
– The recorder notes any valid problems or errors that are discovered; no time or
effort is spent in this meeting to solve any of these problems or errors
• Activities at the conclusion of the meeting
– All attendees must decide whether to
• Accept the product without further modification
• Reject the product due to severe errors (After these errors are corrected, another review
will then occur)
• Accept the product provisionally (Minor errors need to be corrected but no additional
review is required)
– All attendees then complete a sign-off in which they indicate that they took part in
the review and that they concur with the findings

(More on next slide) 19


The FTR Meeting (continued)
• Activities following the meeting
– The recorder produces a list of review issues that
• Identifies problem areas within the product
• Serves as an action item checklist to guide the producer in making corrections
– The recorder includes the list in an FTR summary report
• This one to two-page report describes what was reviewed, who reviewed it,
and what were the findings and conclusions
– The review leader follows up on the findings to ensure that the producer
makes the requested corrections

20
FTR Guidelines
1) Review the product, not the producer
2) Set an agenda and maintain it
3) Limit debate and rebuttal; conduct in-depth discussions off-line
4) Enunciate problem areas, but don't attempt to solve the problem
noted
5) Take written notes; utilize a wall board to capture comments
6) Limit the number of participants and insist upon advance
preparation
7) Develop a checklist for each product in order to structure and focus
the review
8) Allocate resources and schedule time for FTRs
9) Conduct meaningful training for all reviewers
10) Review your earlier reviews to improve the overall review process

21
Statistical Software Quality
Assurance
Process Steps
1) Collect and categorize information (i.e., causes) about software
defects that occur
2) Attempt to trace each defect to its underlying cause (e.g.,
nonconformance to specifications, design error, violation of
standards, poor communication with the customer)
3) Using the Pareto principle (80% of defects can be traced to 20% of
all causes), isolate the 20%
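
A minimal Python sketch of such a Pareto analysis over an invented defect log (the cause labels and counts are hypothetical):

    from collections import Counter

    # Invented defect log: one (hypothetical) cause label per recorded defect.
    defect_causes = (
        ["incomplete or erroneous specification"] * 40
        + ["misinterpretation of customer communication"] * 25
        + ["error in design logic"] * 15
        + ["violation of programming standards"] * 10
        + ["inaccurate documentation"] * 6
        + ["inconsistent component interface"] * 4
    )

    counts = Counter(defect_causes)
    total = sum(counts.values())

    # Accumulate causes from most to least frequent until ~80% of defects are
    # covered; these are the "vital few" to address first.
    cumulative = 0
    vital_few = []
    for cause, n in counts.most_common():
        cumulative += n
        vital_few.append((cause, n))
        if cumulative / total >= 0.80:
            break

    for cause, n in vital_few:
        print(f"{cause}: {n} defects ({n / total:.0%})")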

23
A Sample of Possible Causes
for Defects
• Incomplete or erroneous specifications
• Misinterpretation of customer communication
• Intentional deviation from specifications
• Violation of programming standards
• Errors in data representation
• Inconsistent component interface
• Errors in design logic
• Incomplete or erroneous testing
• Inaccurate or incomplete documentation
• Errors in programming language translation of design
• Ambiguous or inconsistent human/computer interface

24
Six Sigma
• Popularized by Motorola in the 1980s
• Is the most widely used strategy for statistical quality assurance
• Uses data and statistical analysis to measure and improve a company's
operational performance
• Identifies and eliminates defects in manufacturing and service-related
processes
• The "Six Sigma" refers to six standard deviations (3.4 defects per a
million occurrences)
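
The 3.4 figure corresponds to the standard defects-per-million-opportunities (DPMO) metric; a minimal Python sketch with invented inspection numbers:

    # Defects per million opportunities (DPMO) with invented inspection numbers.
    defects_found = 17
    units_inspected = 5_000
    opportunities_per_unit = 10  # distinct ways each unit could be defective

    dpmo = defects_found / (units_inspected * opportunities_per_unit) * 1_000_000
    print(f"DPMO: {dpmo:.1f}")  # a Six Sigma process targets about 3.4 DPMO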

(More on next slide) 25


Six Sigma (continued)
• Three core steps
– Define customer requirements, deliverables, and project goals via well-
defined methods of customer communication
– Measure the existing process and its output to determine current quality
performance (collect defect metrics)
– Analyze defect metrics and determine the vital few causes (the 20%)
• Two additional steps are added for existing processes (and can be done in
parallel)
– Improve the process by eliminating the root causes of defects
– Control the process to ensure that future work does not reintroduce the causes
of defects

26
Six Sigma (continued)
• All of these steps need to be performed so that you can manage the
process to accomplish something
• You cannot effectively manage and improve a process until you first do
these steps (in this order):

1) Define the work process (the work to be done)
2) Measure the work process
3) Analyze the work process
4) Control the work process
5) Manage and improve the work process

27
Software Reliability,
Availability, and Safety
Reliability and Availability
• Software failure
– Defined: Nonconformance to software requirements
– Given a set of valid requirements, all software failures can be traced to design or
implementation problems (i.e., nothing wears out like it does in hardware)
• Software reliability
– Defined: The probability of failure-free operation of a software application in a
specified environment for a specified time
– Estimated using historical and development data
– A simple measure is MTBF = MTTF + MTTR, where MTTF (mean time to failure) is the average uptime between failures and MTTR (mean time to repair) is the average downtime
– Example:
• MTBF = 68 days + 3 days = 71 days
• Failures per 100 days = (1/71) * 100 = 1.4
• Software availability
– Defined: The probability that a software application is operating according to
requirements at a given point in time
– Availability = [MTTF/ (MTTF + MTTR)] * 100%
– Example:
• Avail. = [68 days / (68 days + 3 days)] * 100% ≈ 96%
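
A minimal Python sketch of the MTBF, failure-rate, and availability arithmetic in the examples above:

    # Reliability/availability arithmetic from the examples above.
    mttf_days = 68.0  # mean time to failure (average uptime between failures)
    mttr_days = 3.0   # mean time to repair (average downtime per failure)

    mtbf_days = mttf_days + mttr_days                             # 71 days
    failures_per_100_days = (1 / mtbf_days) * 100                 # ~1.4
    availability_pct = mttf_days / (mttf_days + mttr_days) * 100  # ~95.8%, about 96%

    print(f"MTBF: {mtbf_days:.0f} days")
    print(f"Failures per 100 days: {failures_per_100_days:.1f}")
    print(f"Availability: {availability_pct:.1f}%")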
29
Software Safety
• Focuses on identification and assessment of potential hazards to
software operation
• It differs from software reliability
– Software reliability uses statistical analysis to determine the likelihood
that a software failure will occur; however, the failure may not necessarily
result in a hazard or mishap
– Software safety examines the ways in which failures result in conditions
that can lead to a hazard or mishap; it identifies faults that may lead to
failures
• Software failures are evaluated in the context of an entire computer-
based system and its environment through the process of fault tree
analysis or hazard analysis

30
SQA Plan
Purpose and Layout
• Provides a road map for instituting software quality assurance in an
organization
• Developed by the SQA group to serve as a template for SQA activities
that are instituted for each software project in an organization
• Structured as follows:
– The purpose and scope of the plan
– A description of all software engineering work products that fall within
the purview of SQA
– All applicable standards and practices that are applied during the software
process
– SQA actions and tasks (including reviews and audits) and their placement
throughout the software process
– The tools and methods that support SQA actions and tasks
– Methods for assembling, safeguarding, and maintaining all SQA-related
records
– Organizational roles and responsibilities relative to product quality

32

Software Testing and Quality
Assurance
Software Quality Assurance
Reading Assignment
• Roger S. Pressman, “Software Engineering:
A Practitioner’s Approach,” Fifth Edition,
McGraw-Hill Book Company Europe,
2001.
– Chapter 8: Software Quality Assurance

34
Objectives
• Learn what software quality assurance (SQA) is.
• Learn the major quality factors.
• Understand how reviews are conducted.

35
Topics Covered
• Quality concepts
• Software quality factors
• Software reviews
• The ISO 9001

36
Software Quality Assurance (SQA)
• SQA encompasses
(1) a quality management approach,
(2) effective software engineering technology (methods
and tools),
(3) formal technical reviews that are applied throughout
the software process,
(4) a multi-tiered testing strategy,
(5) control of software documentation and the changes
made to it,
(6) a procedure to ensure compliance with software
development standards (when applicable), and
(7) measurement and reporting mechanisms. 37
Software Quality
• Software quality:
– Conformance to explicitly stated requirements and
standards
• Quality assurance:
– is the activity that leads to “fitness for purpose”.
• Quality product:
– is the one that does what the customer expects it to
do.
User satisfaction = compliant product + good quality +
delivery within budget and schedule
38
Quality Concepts
• Quality control: a series of inspections, reviews, and
tests to ensure a product meets the requirements
placed upon it.
– Includes a feedback loop to the process that created the
work product.
– Quality control activities may be fully automated, entirely
manual, or a combination of automated tools and human
interaction.
• Quality assurance: analysis, auditing and reporting
activities.
– provide management with the data necessary to be
informed about product quality,
39
Software Quality Factors
• Functionality, Usability, Reliability, Performance, and
Supportability (FURPS) quality factors
– Functionality: feature set, capabilities, generality of functions,
and security
– Usability: human factors like consistency, and documentation
– Reliability: frequency and severity of failures, accuracy of
outputs, mean time between failures, ability to recover,
predictability
– Performance: processing speed, response time, resource
consumption, throughput, and efficiency
– Supportability: extensibility, adaptability, maintainability,
testability, compatibility, configurability

40
Why SQA Activities Pay Off?
[Chart: relative cost to find and fix a defect (log scale)]
– Req.: 0.75
– Design: 1.00
– Code: 1.50
– Test: 3.00
– System test: 10.00
– Field use: 60.00-100.00
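
A short Python sketch that uses the relative multipliers above to compare catching the same ten defects at different stages (80 is assumed as the midpoint of the 60-100 field-use range):

    # Relative cost multipliers taken from the chart above.
    relative_cost = {
        "requirements": 0.75,
        "design": 1.00,
        "code": 1.50,
        "test": 3.00,
        "system test": 10.00,
        "field use": 80.00,  # assumed midpoint of the 60-100 range
    }

    defects = 10
    for stage in ("code", "system test", "field use"):
        print(f"{defects} defects caught at {stage}: "
              f"relative cost {defects * relative_cost[stage]:.0f}")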
41
The Quality Movement
• Total Quality Management (TQM) is a
popular approach for management practice.
• TQM stresses continuous process
improvement and can be applied to software
development.
• Not much can be done to improve quality
until a visible, repeatable, and measurable
process is created.
42
TQM
1. The first step establishes a system of continuous process improvement:
develop a process that is visible, repeatable, and measurable.
2. The second step examines intangibles that affect the process and
works to optimize their impact on the process.
3. The third step concentrates on the user of the product, examining the
way the user applies it. This step leads to improvement in the product
itself and, potentially, to the process that created it.
4. The fourth step is a business-oriented step that looks for opportunity
in related areas, identified by observing the use of the product in the
marketplace.

43
Software Quality Assurance
• The SQA group must look at software from
the customer's perspective, as well as
assessing its technical merits.
• The activities performed by the SQA group
involve quality planning, oversight, record
keeping, analysis and reporting.

44
Software Quality Assurance
(Cont.)
[Diagram: elements of SQA]
• SQA process definition & standards
• Formal technical reviews
• Test planning & review
• Analysis & reporting
• Measurement

45
Software Quality Assurance (Cont.)
• Beginning of the project
– Project manager will consider quality factors and
decide which ones are important for the system
– Decide on what validation and verification activities
will be carried out to check that the required quality
factors are present in the product
• During the project
– Validation and verification of quality standards and
procedures
• End of the project
– Assess to what extent the expected quality has been achieved
46
Software Reviews
• Any work product (including documents) should
be reviewed.
• Conducting timely reviews of all work products
can often eliminate 80% of the defects before any
testing is conducted.
• This message often needs to be carried to
managers in the field, whose impatience to
generate code sometimes makes them reluctant to
spend time on reviews.

47
What are Reviews and What Reviews are not?
• Reviews are:
– A meeting conducted by technical people for technical
people
– A technical assessment of a work product created
during the software engineering process
– A software quality assurance mechanism
• Reviews are not:
– A project budget summary
– A scheduling assessment
– An overall progress report
48
Formal Technical Reviews (FTR)
• The objectives of FTR are:
– To uncover errors in functions, logic, or
implementation for any representation of the software.
– To verify that the software under review meets its
requirements.
– To ensure that the software has been represented
according to predefined standards.
– To achieve software that is developed in a uniform
manner.
– To make projects more manageable. 49
Notes on Formal Technical
Reviews
• Review the product, not the producer
• Set an agenda and maintain it
• Take written notes
• Limit the number of participants and insist
upon advance preparation
• The duration of the review meeting should
be less than two hours.
• Develop a checklist for each product that is likely to be reviewed
50
The Players
[Diagram: FTR participants]
• review leader
• standards bearer (SQA)
• producer
• maintenance oracle
• recorder
• reviewer
• user rep

51
How an FTR is Performed
• Producer informs the project leader that the product is
complete and that a review is required
• The project leader forms a review team and appoints a
review leader
• The review leader evaluates the product for readiness,
generates copies of the product material, distributes them to
the reviewers, and schedules a review meeting
• Each reviewer reviews the product, becoming familiar
with it and making notes of any concerns

52
How an FTR is Performed (Cont.)
• Review leader, all reviewers, and producer
attend the meeting
• One of the reviewers takes the role of recorder
• During the meeting, the producer walks through
the product, explaining the material, while the
reviewers raise issues based on their preparation.
If an error is discovered, it is recorded by
the recorder.

53
Conducting the Review
1. Be prepared—evaluate the product before the review
2. Review the product, not the producer
3. Keep your tone mild, ask questions instead of making accusations
4. Stick to the review agenda
5. Raise issues, don't resolve them
6. Avoid discussions of style—stick to technical correctness
7. Schedule reviews as project tasks
8. Record and report all review results

54
Outcome of an FTR
• The attendees of the meeting decide whether to:
– Accept the product without further modification
– Accept the product provisionally. Minor errors have been
encountered. These must be fixed but no additional review will
be needed
– Reject the product due to severe errors. Once the errors are
fixed, another review should be conducted
• At the end of an FTR, a review summary report should
be produced. It should answer the following
– What was reviewed?
– Who was involved in the review?
– What were the findings and conclusions?
55
Outcome of an FTR (Cont.)

56
Review Checklists
• FTR can be conducted during each step in the
software engineering process.
• Checklists can be used to assess products that are
derived as part of software development.
• The checklists are not intended to be
comprehensive, but rather to provide a point of
departure for each review.

57
Software Project Planning
• Software project planning develops estimates for
resources, cost and schedule based on the
software allocation established as part of the
system engineering activity.
• Like any estimation process, software project
planning is inherently risky.
• The review of the Software Project Plan
establishes the degree of risk.

58
Software Project Planning (Cont.)
• The following checklist is applicable:
– Is software scope unambiguously defined and
bounded?
– Is terminology clear?
– Are resources adequate for scope?
– Are resources readily available?
– Have risks in all important categories been
defined?
– Is a risk management plan in place?
– Are tasks properly defined and sequenced?
59
Software Project Planning (Cont.)
– Is parallelism reasonable given available resources?
– Is the basis for cost estimation reasonable? Has the
cost estimate been developed using two independent
methods?
– Have historical productivity and quality data been
used?
– Have differences in estimates been reconciled?
– Are pre-established budgets and deadlines realistic?
– Is the schedule consistent?

60
Software Requirements Analysis
• Reviews for software requirements analysis
focus on traceability to system requirements
and consistency and correctness of the
analysis model.
• A number of FTRs are conducted for the
requirements of a large system and may also be
followed by reviews and evaluation of
prototypes as well as customer meetings.
61
Software Requirements Analysis
(Cont.)
• The following topics are considered during
FTRs for analysis:
– Is information domain analysis complete,
consistent and accurate?
– Is problem partitioning complete?
– Are external and internal interfaces properly
defined?
– Does the data model properly reflect data
objects, their attributes, and relationships?
– Are all requirements traceable to system level?
62
Software Requirements Analysis (Cont.)

– Has prototyping been conducted for the user/customer?
– Is performance achievable within the
constraints imposed by other system elements?
– Are requirements consistent with schedule,
resources and budget?
– Are validation criteria complete?

63
Software Design
• Reviews for software design focus on data design,
architectural design and procedural design.
• In general, two types of design reviews are conducted:
– The preliminary design review assesses the translation of
requirements to the design of data and architecture.
– The second review, often called a design walkthrough,
concentrates on the procedural correctness of algorithms as
they are implemented within program modules.

64
Software Design (Cont.)
• The following checklists are useful for preliminary
design review:
– Are software requirements reflected in the software
architecture?
– Is effective modularity achieved? Are modules functionally
independent?
– Are interfaces defined for modules and external system
elements?
– Is the data structure consistent with information domain?
– Is data structure consistent with software requirements?
– Has maintainability been considered?
– Have quality factors been explicitly assessed?

65
Software Design (Cont.)
• The following checklists are useful for Design
walkthrough:
– Does the algorithm accomplish the desired function?
– Is the algorithm logically correct?
– Is the interface consistent with architectural design?
– Is the logical complexity reasonable?
– Has error handling been specified?
– Are local data structures properly defined?

66
Software Design (Cont.)

– Are structured programming constructs used throughout?
– Is design detail amenable to the implementation
language?
– Are operating system or language dependent
features used?
– Has maintainability been considered?

67
Coding
• Errors can be introduced as the design is
translated into a programming language.
• A code walkthrough can be an effective
means for uncovering these translation
errors.

68
Coding (Cont.)
• The following checklist assumes that a design
walkthrough has been conducted and that algorithm
correctness has been established as part of the design
FTR.
– Has the design properly been translated into code? [The results
of the procedural design should be available during this
review.]
– Are there misspellings and typos?
– Has proper use of language conventions been made?
– Is there compliance with coding standards for language style,
comments, module prologue?

69
Coding (Cont.)
– Are there incorrect or ambiguous comments?
– Are data types and data declaration proper?
– Are physical constants correct?
– Have all items on the design walkthrough
checklist been re-applied (as required)?

70
Software Testing
• Software testing is a quality assurance
activity in its own right.
• The completeness and effectiveness of
testing can be dramatically improved by
critically assessing any test plans and
procedures that have been created.

71
Test Plan
• The following checklists are useful for test plan
walkthrough:
– Have major test phases properly been identified and
sequenced?
– Has traceability to validation criteria/requirements been
established as part of software requirements analysis?
– Are major functions demonstrated early?
– Is the test plan consistent with overall project plan?
– Has a test schedule been explicitly defined?
– Are test resources and tools identified and available?
– Has a test record keeping mechanism been established?

72
Test Procedure
• The following checklists are useful for test
procedure walkthrough:
– Have both white and black box tests been specified?
– Have all independent logic paths been tested?
– Have test cases been identified and listed with
expected results?
– Is error-handling to be tested?
– Have all boundary values been tested?
– Are timing and performance to be tested?
– Has acceptable variation from expected results been
specified? 73
Maintenance
• The review checklists for software development are equally valid
for the software maintenance phase. In addition to all of the
questions posed in the checklists, the following special
considerations should be kept in mind:
– Have side effects associated with change been considered?
– Has the request for change been documented, evaluated and approved?
– Has the change, once made, been documented and reported to interested
parties?
– Have appropriate FTRs been conducted?
– Has a final acceptance review been conducted to ensure that all software
has been properly updated, tested and replaced?

74
Statistical Quality Assurance
• Each defect needs to be traced to its cause.
• Defect causes having the greatest impact on
the success of the project must be addressed
first.

75
Statistical SQA
[Diagram: product & process measurement leads to an understanding of how to improve quality]
• collect information on all defects
• find the causes of the defects
• move to provide fixes for the process

76
Metrics Derived from Reviews
• Inspection time per page of documentation
• Inspection time per KLOC or FP
• Inspection effort per KLOC or FP
• Errors uncovered per reviewer hour
• Errors uncovered per preparation hour
• Errors uncovered per SE task (e.g., design)
• Number of minor errors (e.g., typos)
• Number of major errors (e.g., nonconformance to
requirements)
• Number of errors found during preparation
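
A small Python sketch of computing a few of these metrics from hypothetical inspection data:

    # Hypothetical inspection data for one review.
    loc_inspected = 4_500         # lines of code covered by the review
    preparation_hours = 6.0       # total preparation time across all reviewers
    reviewer_meeting_hours = 8.0  # meeting duration * number of reviewers
    minor_errors = 11             # e.g., typos
    major_errors = 4              # e.g., nonconformance to requirements

    total_errors = minor_errors + major_errors
    kloc = loc_inspected / 1_000

    print(f"Errors uncovered per KLOC: {total_errors / kloc:.1f}")
    print(f"Errors uncovered per preparation hour: {total_errors / preparation_hours:.1f}")
    print(f"Errors uncovered per reviewer hour: {total_errors / reviewer_meeting_hours:.1f}")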

77
Software Reliability
• Software consistency: repeatability of results.
• Reliability: probability of failure free
operation for a specified time period.
– Don’t apply hardware reliability theory directly to software;
a key point is that, unlike hardware, software does not wear
out, so failures are likely to be caused by design defects.

78
IEEE Standard: Software
Quality Assurance Plan

79
The ISO 9001 Quality Standards
• One of the most important worldwide
standards for quality assurance
• Adopted for use by over 130 countries
• Not industry specific but expressed in
general terms
• ISO 9001 is the quality assurance standard
that contains 20 requirements that must be
present in any software quality assurance
system. 80
The ISO 9001: How it Works?
• A software developer implements a quality system according
to ISO 9001 specifications
• The quality system is used for some time to detect any
problems in the system
• Third party audit
• Accreditation request to the responsible body
• Accreditation body inspects the quality system documentation
and visits the organization
• On successful inspection, a certificate is issued
• Unannounced visits to check whether the quality system is
adhered to or not

81
Key Points
• Quality assurance is the activity that leads to “fitness for
purpose”
• Major software quality factors: functionality, usability,
reliability, performance, and supportability.
• Any work product (including documents) should be
reviewed.
• FTRs aim to find errors during the process so that they don’t
become defects after the release of the software.
• Checklists can be used to assess products that are derived as
part of software development.
• ISO 9001 is the quality assurance standard that contains 20
requirements that must be present in any software quality
assurance system.
82
