


PRACTICE OBJECTIVE

This glossary of testing terminology has two objectives: first, to define the test terms that will be used throughout this manual; and second, to provide a basis for establishing a glossary of testing terminology for your organization. Information services (I/S) organizations that use a common testing vocabulary are better able to accelerate the maturity of their testing.

QAI believes that the testing terminology as defined in this glossary is the most commonly held definition for these terms. Therefore, QAI recommends that this glossary be adopted as a set of core testing term definitions. The glossary can then be supplemented with testing terms that are specific to your organization. These additional terms might include:

• Names of acquired testing tools
• Testing tools developed in your organization
• Names of testing libraries
• Names of testing reports
• Terms which cause specific action to occur

Acceptance Testing: Formal testing conducted to determine whether or not a system satisfies its acceptance criteria; it enables an end user to determine whether or not to accept the system.

Affinity Diagram: A group process that takes large amounts of language data, such as a list developed by brainstorming, and divides it into categories.

Alpha Testing: Testing of a software product or system conducted at the developer's site by the end user.

Audit: An inspection/assessment activity that verifies compliance with plans, policies, and procedures, and ensures that resources are conserved. Audit is a staff function; it serves as the "eyes and ears" of management.

Automated Testing: That part of software testing that is assisted with software tool(s) and does not require operator input, analysis, or evaluation.

Beta Testing: Testing conducted at one or more end user sites by the end user of a delivered software product or system.

Black-box Testing: Functional testing based on requirements with no knowledge of the internal program structure or data. Also known as closed-box testing.

Bottom-up Testing: An integration testing technique that tests the low-level components first, using test drivers in place of components that have not yet been developed to call the low-level components for test.

Boundary Value Analysis: A test data selection technique in which values are chosen to lie along data extremes. Boundary values include maximum, minimum, just inside/outside boundaries, typical values, and error values.

Brainstorming: A group process for generating creative and diverse ideas.

Branch Coverage Testing: A test method satisfying coverage criteria that requires each decision point at each possible branch to be executed at least once.

Bug: A design flaw that will result in symptoms exhibited by some object (the object under test or some other object) when the object is subjected to an appropriate test.

Cause-and-Effect (Fishbone) Diagram: A tool used to identify possible causes of a problem by representing the relationship between some effect and its possible causes.
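The boundary value analysis entry above can be illustrated with a small sketch. The function and its valid range (18 to 65) are hypothetical, chosen only to show how test values are selected at and around the data extremes:

```python
def accept_age(age: int) -> bool:
    # Hypothetical unit under test: accepts ages in the assumed range 18-65.
    return 18 <= age <= 65

# Boundary value analysis: values at and just inside/outside the extremes,
# plus a typical value and an error value.
boundary_cases = {
    17: False,  # just outside minimum
    18: True,   # minimum
    19: True,   # just inside minimum
    40: True,   # typical value
    64: True,   # just inside maximum
    65: True,   # maximum
    66: False,  # just outside maximum
    -1: False,  # error value
}

for value, expected in boundary_cases.items():
    assert accept_age(value) == expected
```

Note that a purely "typical value" test set (say, 30 and 50) would pass even if the programmer had written `<` instead of `<=`; the boundary cases are what expose such off-by-one faults.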

Copyright © 1997 · Quality Assurance Institute® · Orlando, Florida

QC. 1.1.1

Cause-effect Graphing: A testing technique that aids in selecting, in a systematic way, a high-yield set of test cases that logically relates causes to effects to produce test cases. It has a beneficial side effect in pointing out incompleteness and ambiguities in specifications.

Checksheet: A form used to record data as it is gathered.

Clear-box Testing: Another term for white-box testing. Structural testing is sometimes referred to as clear-box testing, since "white boxes" are considered opaque and do not really permit visibility into the code. This is also known as glass-box or open-box testing.

Client: The end user that pays for the product received and receives the benefit from the use of the product.

Control Chart: A statistical method for distinguishing between common and special cause variation exhibited by processes.

Customer (end user): The individual or organization, internal or external to the producing organization, that receives the product.

Cyclomatic Complexity: A measure of the number of linearly independent paths through a program module.

Data Flow Analysis: Consists of the graphical analysis of collections of (sequential) data definitions and reference patterns to determine constraints that can be placed on data values at various points of executing the source program.

Debugging: The act of attempting to determine the cause of the symptoms of malfunctions detected by testing or by frenzied user complaints.

Defect: Operationally, it is useful to work with two definitions of a defect: 1) From the producer's viewpoint: a product requirement that has not been met, or a product attribute possessed by a product or a function performed by a product that is not in the statement of requirements that defines the product. 2) From the end user's viewpoint: anything that causes end user dissatisfaction, whether in the statement of requirements or not.

Defect Analysis: Using defects as data for continuous quality improvement. Defect analysis generally seeks to classify defects into categories and identify possible causes in order to direct process improvement efforts.

Defect Density: The ratio of the number of defects to program length (a relative number).

Desk Checking: A form of manual static analysis, usually performed by the originator. Source code, documentation, etc., is visually checked against requirements and standards.

Dynamic Analysis: The process of evaluating a program based on execution of that program. Dynamic analysis approaches rely on executing a piece of software with selected test data.

Dynamic Testing: Verification or validation performed by executing the system's code.

Error: 1) A discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition; and 2) a mental mistake made by a programmer that may result in a program fault.

Error-based Testing: Testing in which information about programming style, error-prone language constructs, and other programming knowledge is applied to select test data capable of detecting faults, either a specified class of faults or all possible faults.

Evaluation: The process of examining a system or system component to determine the extent to which specified properties are present.

Execution: The process of a computer carrying out an instruction or instructions of a computer program.

Exhaustive Testing: Executing the program with all possible combinations of values for program variables.

Failure: The inability of a system or system component to perform a required function within specified limits. A failure may be produced when a fault is encountered.

Failure-directed Testing: Testing based on the knowledge of the types of errors made in the past that are likely for the system under test.

Fault: A manifestation of an error in software. A fault, if encountered, may cause a failure.

Fault-based Testing: Testing that employs a test data selection strategy designed to generate test data capable of demonstrating the absence of a set of prespecified faults, typically, frequently occurring faults.
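The cyclomatic complexity entry above can be made concrete with a rough sketch. This is not the full graph-theoretic calculation (edges minus nodes plus two); it approximates the same number by counting decision points in Python source and adding one, which agrees with the graph formula for simple single-entry, single-exit code:

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    # Approximate V(G) as 1 + the number of decision points
    # (if/elif, loops, conditional expressions, and/or operators).
    tree = ast.parse(source)
    decisions = 0
    for node in ast.walk(tree):
        if isinstance(node, (ast.If, ast.While, ast.For, ast.IfExp)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            decisions += len(node.values) - 1
    return decisions + 1

sample = """
def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(sample))  # 3: two decision points, plus one
```

The result, 3, matches the number of linearly independent paths through `classify` (negative, zero, and positive inputs).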

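Exhaustive testing, as defined above, is practical only for very small input domains. A minimal sketch, assuming a hypothetical three-input majority function:

```python
from itertools import product

def majority(a: bool, b: bool, c: bool) -> bool:
    # Hypothetical unit under test: true when at least two inputs are true.
    return (a and b) or (a and c) or (b and c)

# Exhaustive testing: execute the program with all possible combinations
# of values for its (here, boolean) input variables -- 2**3 = 8 cases.
for a, b, c in product([False, True], repeat=3):
    expected = sum([a, b, c]) >= 2
    assert majority(a, b, c) == expected
```

With three boolean inputs there are only eight combinations; with a single 32-bit integer input there are over four billion, which is why exhaustive testing is rarely feasible and selection techniques such as boundary value analysis exist.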

Fault Tree Analysis: A form of safety analysis that assesses hardware safety to provide failure statistics and sensitivity analyses that indicate the possible effect of critical failures.

Flowchart: A diagram showing the sequential steps of a process or of a workflow around a product or service.

Formal Review: A technical review conducted with the end user, including the types of reviews called for in the standards.

Functional Testing: Application of test data derived from the specified functional requirements without regard to the final program structure. Also known as black-box testing.

Function Points: A consistent measure of software size based on user requirements. Data components include inputs, outputs, etc. Environment characteristics include data communications, performance, reusability, operational ease, etc. Weight scale: 0 = not present; 1 = minor influence; 5 = strong influence.

Heuristics Testing: Another term for failure-directed testing.

Histogram: A graphical description of individual measured values in a data set, organized according to the frequency or relative frequency of occurrence. A histogram illustrates the shape of the distribution of individual values in a data set along with information regarding the average and variation.

Hybrid Testing: A combination of top-down testing with bottom-up testing of prioritized or available components.

Incremental Analysis: Incremental analysis occurs when (partial) analysis may be performed on an incomplete product to allow early feedback on the development of that product.

Infeasible Path: A program statement sequence that can never be executed.

Inputs: Products, services, or information needed from suppliers to make a process work.

Inspection: 1) A formal evaluation technique in which software requirements, design, or code are examined in detail by a person or group other than the author to detect faults, violations of development standards, and other problems. 2) A quality improvement process for written material that consists of two dominant components: product (document) improvement and process improvement (document production and inspection).

Instrument: To install or insert devices or instructions into hardware or software to monitor the operation of a system or component.

Integration: The process of combining software components or hardware components, or both, into an overall system.

Integration Testing: An orderly progression of testing in which software components or hardware components, or both, are combined and tested until the entire system has been integrated.

Interface: A shared boundary. An interface might be a hardware component to link two devices, or it might be a portion of storage or registers accessed by two or more computer programs.

Interface Analysis: Checks the interfaces between program elements for consistency and adherence to predefined rules or axioms.

Intrusive Testing: Testing that collects timing and processing information during program execution that may change the behavior of the software from its behavior in a real environment. Usually involves additional code embedded in the software being tested, or additional processes running concurrently with the software being tested on the same platform.

IV&V: Independent verification and validation is the verification and validation of a software product by an organization that is both technically and managerially separate from the organization responsible for developing the product.

Life Cycle: The period that starts when a software product is conceived and ends when the product is no longer available for use. The software life cycle typically includes a requirements phase, design phase, implementation (code) phase, test phase, installation and checkout phase, operation and maintenance phase, and a retirement phase.

Manual Testing: That part of software testing that requires operator input, analysis, or evaluation.

Mean: A value derived by adding several quantities and dividing the sum by the number of these quantities.
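The test drivers used in bottom-up and integration testing can be sketched as follows. The `parse_record` component is hypothetical; the point is that the driver stands in for a higher-level component that has not yet been developed and calls the low-level component directly:

```python
# Low-level component, assumed already implemented and under test.
def parse_record(line: str) -> dict:
    name, amount = line.split(",")
    return {"name": name.strip(), "amount": int(amount)}

# Test driver: a stand-in for the not-yet-developed high-level component.
# In bottom-up integration, drivers like this exercise the low-level
# components before the real callers exist.
def driver():
    record = parse_record("widget, 42")
    assert record == {"name": "widget", "amount": 42}
    return record

driver()
```

As the real higher-level components are completed, they replace the driver one by one, and integration testing proceeds until the entire system has been combined.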


Measure: To ascertain or appraise by comparing to a standard; to apply a metric.

Measurement: 1) The act or process of measuring. 2) A figure, extent, or amount obtained by measuring.

Metric: A measure of the extent or degree to which a product possesses and exhibits a certain quality, property, or attribute.

Mutation Analysis: A method to determine test set thoroughness by measuring the extent to which a test set can discriminate the program from slight variants of the program.

Nonintrusive Testing: Testing that is transparent to the software under test; i.e., testing that does not change the timing or processing characteristics of the software under test from its behavior in a real environment. Usually involves additional hardware that collects timing or processing information and processes that information on another platform.

Operational Requirements: Qualitative and quantitative parameters that specify the desired operational capabilities of a system and serve as a basis for determining the operational effectiveness and suitability of a system prior to deployment.

Operational Testing: Testing performed by the end user on software in its normal operating environment.

Outputs: Products, services, or information supplied to meet end user needs.

Path Analysis: Program analysis performed to identify all possible paths through a program, to detect incomplete paths, or to discover portions of the program that are not on any path.

Path Coverage Testing: A test method satisfying coverage criteria that each logical path through the program is tested. Often, paths through the program are grouped into a finite set of classes; one path from each class is tested.

Policy: Managerial desires and intents concerning either process (intended objectives) or products (desired attributes).

Problem: Any deviation from defined standards. Same as defect.

Procedure: The step-by-step method followed to ensure that standards are met.

Process: The work effort that produces a product. This includes efforts of people and equipment guided by policies, standards, and procedures.

Process Improvement: To change a process to make the process produce a given product faster, more economically, or of higher quality. Such changes may require the product to be changed. The defect rate must be maintained or reduced.

Product: The output of a process; the work product. There are three classes of products: manufactured products (standard and custom), administrative/information products (letters, etc.), and service products (physical, intellectual, physiological, and psychological). Products are defined by a statement of requirements; they are produced by one or more people working in a process.

Product Improvement: To change the statement of requirements that defines a product to make the product more satisfying and attractive to the end user (more competitive). Such changes may add to or delete from the list of attributes and/or the list of functions defining a product. Such changes frequently require the process to be changed. NOTE: This process may result in a totally new product.

Productivity: The ratio of the output of a process to the input, usually measured in the same units. It is frequently useful to compare the value added to a product by a process to the value of the input resources required (using fair market values for both input and output).

Proof Checker: A program that checks formal proofs of program properties for logical correctness.

Prototyping: Evaluating requirements or designs at the conceptualization phase, the requirements analysis phase, or the design phase by quickly building scaled-down components of the intended system to obtain rapid feedback of analysis and design decisions.

Qualification Testing: Formal testing, usually conducted by the developer for the end user, to demonstrate that the software meets its specified requirements.

Quality: A product is a quality product if it is defect free. To the producer, a product is a quality product if it meets or conforms to the statement of requirements that defines the product. This statement is usually shortened to: quality means meets requirements. NOTE: Operationally, the word quality refers to products.

Quality Assurance (QA): The set of support activities (including facilitation, training, measurement, and analysis) needed to provide adequate confidence that processes are established and continuously improved in order to produce products that meet specifications and are fit for use.

Quality Control (QC): The process by which product quality is compared with applicable standards, and the action taken when nonconformance is detected. Its focus is defect detection and removal.

Quality Improvement: To change a production process so that the rate at which defective products (defects) are produced is reduced. Some process changes may require the product to be changed.

Random Testing: An essentially black-box testing approach in which a program is tested by randomly choosing a subset of all possible input values. The distribution may be arbitrary or may attempt to accurately reflect the distribution of inputs in the application environment.

Regression Testing: Selective retesting to detect faults introduced during modification of a system or system component, to verify that modifications have not caused adverse effects, or to verify that a modified system or system component still meets its specified requirements.

Reliability: The probability of failure-free operation for a specified period.

Reviews: A methodical examination of software work products by the producer's peers to identify defects and areas where changes are needed. A way to use the diversity and power of a group of people to point out needed improvements in a product, or to confirm those parts of a product in which improvement is either not desired or not needed. A review is a general work product evaluation technique that includes desk checking, walkthroughs, technical reviews, peer reviews, formal reviews, and inspections.

Run Chart: A graph of data points in chronological order used to illustrate trends or cycles of the characteristic being measured, for the purpose of suggesting an assignable cause rather than random variation.

Scatter Plot (correlation diagram): A graph designed to show whether there is a relationship between two changing factors.

Semantics: 1) The relationship of characters or a group of characters to their meanings, independent of the manner of their interpretation and use. 2) The relationships between symbols and their meanings.

Software Characteristic: An inherent, possibly accidental, trait, quality, or property of software (for example, functionality, performance, attributes, design constraints, number of states, lines of branches, etc.).

Software Feature: A software characteristic specified or implied by requirements documentation (for example, functionality, performance, attributes, or design constraints).

Software Tool: A computer program used to develop, test, analyze, or maintain another computer program or its documentation; e.g., automated design tools, compilers, test tools, and maintenance tools.

Standardize: Procedures are implemented to ensure that the output of a process is maintained at a desired level.

Standards: The measure used to evaluate products and identify nonconformance. The basis upon which adherence to policies is measured.

Statement Coverage Testing: A test method satisfying coverage criteria that requires each statement be executed at least once.

Statement of Requirements: The exhaustive list of requirements that define a product. NOTE: The statement of requirements should document requirements proposed and rejected (including the reason for the rejection) during the requirements process.

Static Testing: Verification performed without executing the system's code. Also called static analysis.

Statistical Process Control: The use of statistical
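The mutation analysis entry above can be sketched in a few lines. The `discount` function and both test sets are hypothetical; the example shows how a slight variant (mutant) of the program gauges the thoroughness of a test set:

```python
# Original program and a slight variant (mutant) with one operator changed.
def discount(price):
    return price * 0.9 if price >= 100 else price

def discount_mutant(price):        # ">=" mutated to ">"
    return price * 0.9 if price > 100 else price

# A test set is thorough, in the mutation analysis sense, if it can
# discriminate the original program from the mutant ("kill" the mutant).
weak_tests = [50, 150]             # never exercises the price == 100 boundary
strong_tests = [50, 150, 100]

def kills(tests):
    return any(discount(p) != discount_mutant(p) for p in tests)

print(kills(weak_tests))    # False: the mutant survives the weak test set
print(kills(strong_tests))  # True: the boundary case kills the mutant
```

A surviving mutant signals that the test set never exercises the mutated construct, here the boundary at 100, and should be strengthened.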
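Statement coverage testing, as defined above, can be illustrated with a small sketch that records which lines of a hypothetical function execute. It uses Python's `sys.settrace` hook as a stand-in for a real coverage tool:

```python
import sys

def absolute(n):
    if n < 0:
        n = -n           # executed only for negative input
    return n

executed = set()

def record(frame, event, arg):
    # Trace hook: remember each executed line number inside `absolute`.
    if event == "line" and frame.f_code.co_name == "absolute":
        executed.add(frame.f_lineno)
    return record

sys.settrace(record)
absolute(5)              # the statement `n = -n` is never executed
sys.settrace(None)
lines_after_one_case = len(executed)

sys.settrace(record)
absolute(-5)             # now every statement has executed at least once
sys.settrace(None)
print(len(executed) > lines_after_one_case)  # True
```

A single positive input satisfies neither statement nor branch coverage here; adding the negative input makes every statement (and both branch outcomes of the `if`) execute at least once.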