
Calibration

BIAS AND ACCURACY

Definition of Accuracy and Bias

Accuracy is a qualitative term referring to whether there is agreement between a measurement made on an object and its true (target or reference) value. Bias is a quantitative term describing the difference between the average of measurements made on the same object and its true value. In particular, for a measurement laboratory, bias is the difference (generally unknown) between a laboratory's average value (over time) for a test item and the average that would be achieved by the reference laboratory if it undertook the same measurements on the same test item.

[Figure: Depiction of bias, contrasting unbiased measurements relative to the target with biased measurements relative to the target.]

Identification of Bias

Bias in a measurement process can be identified by:

1. Calibration of standards and/or instruments by a reference laboratory, where a value is assigned to the client's standard based on comparisons with the reference laboratory's standards.

2. Measurement assurance programs, where artifacts from a reference laboratory or other qualified agency are sent to a client and measured in the client's environment as a 'blind' sample.

3. Interlaboratory comparisons, where reference standards or materials are circulated among several laboratories.
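
As a minimal illustration of the first method, the sketch below (Python, with invented names and values) estimates a laboratory's bias as the difference between its average measurement of a transfer standard and the value assigned by the reference laboratory:

    # Hypothetical illustration: estimating laboratory bias against a reference value.
    # The reference laboratory has assigned a value to the transfer standard;
    # the client laboratory measures the same standard repeatedly over time.

    reference_value = 10.0000          # value assigned by the reference laboratory
    lab_measurements = [10.0021, 10.0018, 10.0025, 10.0019, 10.0023]

    lab_average = sum(lab_measurements) / len(lab_measurements)
    bias = lab_average - reference_value   # positive bias: the lab reads high

    print(f"lab average = {lab_average:.4f}, estimated bias = {bias:+.4f}")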

Reduction of Bias

Bias can be eliminated or reduced by calibration of standards and/or instruments. Because of costs and time constraints, the majority of calibrations are performed by secondary or tertiary laboratories and are related to the reference base via a chain of intercomparisons that start at the reference laboratory.

Caution

Errors that contribute to bias can be present even where all equipment and standards are properly calibrated and under control. Temperature probably has the most potential for introducing this type of bias into the measurements. For example, a constant heat source will introduce serious errors in dimensional measurements of metal objects. Temperature affects chemical and electrical measurements as well.

Generally speaking, errors of this type can be identified only by those who are thoroughly familiar with the measurement technology.


CALIBRATION

The purpose of this section is to outline the procedures for calibrating instruments while guaranteeing the 'goodness' of the calibration results.

Calibration Reduces Bias

Calibration is a measurement process that assigns values to the response of an instrument relative to reference standards or to a designated measurement process. The purpose of calibration is to eliminate or reduce bias in the user's measurement system relative to the reference base.

Instrument Calibration

The calibration procedure compares an "unknown" test item(s) or instrument with reference standards according to a specific algorithm.

Purpose of Instrument Calibration

Instrument calibration is intended to eliminate or reduce bias in an instrument's readings over a range of continuous values. For this purpose, reference standards with known values for selected points covering the range of interest are measured with the instrument in question.

Instruments which Require Correction for Bias

The instrument reads in the same units as the reference standards. The purpose of the calibration is to identify and eliminate any bias in the instrument relative to the defined unit of measurement.


Basic Steps for Correcting the Instrument for Bias

The calibration method requires the following basic steps:
• Selection of reference standards with known values to cover the range of interest.
• Measurements on the reference standards with the instrument to be calibrated.
• A functional relationship between the measured and known values of the reference standards (usually a least-squares fit to the data), called a calibration curve.
• Correction of all measurements by the inverse of the calibration curve (a worked sketch follows the linear model below).

Notation

The following notation is used in discussing models for calibration curves.
• Y denotes a measurement on a reference standard.
• X denotes the known value of a reference standard.
• ε denotes measurement error.
• a and b denote coefficients to be determined.

Possible Form for Calibration Curves

There are several models for calibration curves that can be considered for instrument calibration. The simplest is the linear model:

    Y = a + bX + ε
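
The sketch below walks through the basic calibration steps for this linear case. It is illustrative only: the values are invented, and numpy's polyfit stands in for whatever least-squares routine a laboratory actually uses.

    import numpy as np

    # Step 1: reference standards with known values covering the range of interest
    known_x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])          # known values (illustrative)

    # Step 2: readings of those standards by the instrument being calibrated
    measured_y = np.array([2.13, 4.21, 6.26, 8.31, 10.40])

    # Step 3: least-squares fit of the calibration curve Y = a + b*X
    b, a = np.polyfit(known_x, measured_y, 1)               # polyfit returns slope first

    # Step 4: correct a new reading Y' by the inverse of the curve, X' = (Y' - a) / b
    new_reading = 7.00
    corrected = (new_reading - a) / b
    print(f"a = {a:.4f}, b = {b:.4f}, corrected value = {corrected:.4f}")

Note that the correction applies the inverse of the fitted curve, X' = (Y' - a)/b, rather than the curve itself: the fit predicts instrument response from a known value, while the user needs the reverse.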

Selection of Reference Standards

A minimum of three reference standards (preferably five) is required for a linear calibration curve.

Number of Repetitions on Each Reference Standard

A minimum of two measurements on each reference standard is required. These repetitions provide the data for determining whether a candidate model is adequate for calibrating the instrument.


Assumption Regarding Reference Values

The basic assumption regarding the reference values in the calibration experiment is that they are known without error.

Assumptions Regarding Measurement Errors

The basic assumptions regarding measurement errors associated with the instrument are that they are:
• free from outliers
• independent
• from a normal distribution
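
The sketch below probes the last two assumptions using invented residuals from a fitted calibration curve; the Shapiro-Wilk test is offered as one possible normality check, not a prescribed one (screening for outliers is discussed under WHAT CAN GO WRONG):

    import numpy as np
    from scipy import stats

    # Residuals from a fitted calibration curve, in measurement order (illustrative)
    residuals = np.array([0.011, -0.008, 0.004, -0.012, 0.007, -0.003, 0.009, -0.006])

    # Normality: Shapiro-Wilk test on the residuals (one possible check)
    w_stat, p_value = stats.shapiro(residuals)
    print(f"Shapiro-Wilk W = {w_stat:.3f}, p = {p_value:.3f}")

    # Independence: the lag-1 autocorrelation of the residuals should be near zero
    r1 = np.corrcoef(residuals[:-1], residuals[1:])[0, 1]
    print(f"lag-1 autocorrelation = {r1:+.3f}")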

Check Standards Needed for the Control Program

For linear calibration, it is sufficient to control the end-points and the middle of the calibration interval to ensure that the instrument does not drift out of calibration. Therefore, check standards are required at three points; namely:
• at the lower end of the regime
• at the mid-range of the regime
• at the upper end of the regime
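
A minimal sketch of such a control check at one check-standard point follows, assuming invented values and 3-sigma control limits; the actual limits and test statistic should come from the laboratory's own control program:

    # Accepted value and historical standard deviation of the mid-range
    # check standard (illustrative values; 3-sigma limits assumed)
    accepted_value = 6.000
    s_check = 0.012
    k = 3

    new_measurement = 6.041
    if abs(new_measurement - accepted_value) <= k * s_check:
        print("check standard in control")
    else:
        print("out of control: investigate and recalibrate before use")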


WHAT CAN GO WRONG

Calibration Procedure May Fail to Eliminate Bias

There are several circumstances where the calibration will not reduce or eliminate bias as intended. A critical exploratory analysis of the calibration data should expose such problems.

Lack of Precision

Poor instrument precision or unsuspected day-to-day effects may result in standard deviations that are large enough to jeopardize the calibration. There is nothing intrinsic to the calibration procedure that will improve precision; the best strategy is to estimate the instrument's precision in the environment of interest and decide whether it is good enough for the precision required.
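
One rough way to make that estimate, sketched below with invented readings, is to separate within-day repeatability from day-to-day scatter using repeated measurements of a stable check standard:

    import numpy as np

    # Repeated readings of the same check standard on several days (illustrative)
    days = [
        [6.012, 6.015, 6.011],
        [6.020, 6.018, 6.022],
        [6.009, 6.011, 6.008],
    ]

    # Repeatability: within-day standard deviation, pooled over days
    # (equal group sizes, so pooling reduces to averaging the variances)
    s_repeat = np.sqrt(np.mean([np.var(d, ddof=1) for d in days]))

    # Day-to-day effect: scatter of the daily averages
    s_days = np.std([np.mean(d) for d in days], ddof=1)

    print(f"repeatability s = {s_repeat:.4f}, day-to-day s = {s_days:.4f}")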

Outliers in the Calibration Data

Outliers in the calibration data can seriously distort the calibration system. Isolated outliers should be deleted from the calibration data.
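
A hypothetical screen for such outliers is sketched below with invented data; the MAD-based rule and the factor of 5 are illustrative choices, not a prescribed procedure:

    import numpy as np

    # Calibration data with one gross error at X = 6 (illustrative values)
    known_x    = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 6.0])
    measured_y = np.array([2.13, 4.21, 6.26, 8.31, 10.40, 7.80])

    # Fit the curve and examine the residuals
    b, a = np.polyfit(known_x, measured_y, 1)
    residuals = measured_y - (a + b * known_x)

    # Robust screen: flag residuals far from the median (MAD-based), since a
    # gross error inflates the ordinary standard deviation and can mask itself
    dev = np.abs(residuals - np.median(residuals))
    flagged = dev > 5 * np.median(dev)
    print("flagged points:", list(zip(known_x[flagged], measured_y[flagged])))

    # Refit with the isolated outlier deleted
    b2, a2 = np.polyfit(known_x[~flagged], measured_y[~flagged], 1)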

Systematic Differences among Operators

It is possible for different operators to produce measurements with biases that differ in sign and magnitude. This is not usually a problem for automated instrumentation, but for instruments that depend on line of sight, results may differ significantly by operator. To diagnose this problem, measurements by different operators on the same artifacts are plotted and compared. Small differences among operators can be accepted as part of the imprecision of the measurement process, but large systematic differences among operators require resolution. Possible solutions are to retrain the operators or to maintain separate calibration curves by operator.

This aspect has been dealt with separately in the Gauge R&R Study.
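
A minimal sketch of such a comparison, with invented readings for two hypothetical operators:

    import numpy as np

    # Repeated measurements of the same artifact by each operator (illustrative)
    readings = {
        "operator_A": [6.012, 6.015, 6.013],
        "operator_B": [6.041, 6.044, 6.040],   # reads systematically high
    }

    for name, values in readings.items():
        print(f"{name}: mean = {np.mean(values):.4f}, s = {np.std(values, ddof=1):.4f}")

A gap between operator means that is many times larger than either operator's own scatter signals a systematic difference requiring resolution.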


Lack of System Control

The calibration procedure, once established, relies on the instrument continuing to respond in the same way over time. If the system drifts or takes unpredictable excursions, the calibrated values may not be properly corrected for bias, and depending on the direction of change, the calibration may further degrade the accuracy of the measurements. To assure that future measurements are properly corrected for bias, the calibration procedure should be coupled with a statistical control procedure for the instrument.

