
NORDTEST

Tasks
The tasks of Nordtest are to promote the safety of life, health, the environment and material
values and to encourage a free exchange of trade. The approach adopted by Nordtest to
achieve its objectives is:

- to develop, adopt and recommend test methods and to promote the use of these
by industry and the authorities and also in the standardisation work

- to obtain international recognition of test results and also the competence of the
Nordic countries, for instance by quality assurance and verification of testing
activity

- to endeavour that tests and approval of test results are made in a resource and
cost effective manner

- to promote the technical testing infrastructure in the Nordic countries by means
  of research, development of competence and collaboration, and

- to participate in the European and international development of testing and to
  promote Nordic interests.

Organisation
The organisation consists of a board, a secretariat and nine technical groups. These groups
are Acoustics and Noise, Building, Electronics, Environment, Fire, Mechanics, Polymers, VVS
(Mechanical Building Services) and Quality Assurance.

The work is directed by the board which comprises representatives of all the Nordic
countries. The members are appointed by the government or appropriate department of the
country concerned.

The technical groups initiate and evaluate projects. The projects are often structured in such
a way that they can be used as catalysts for the development of the combined technical
competence in the Nordic countries. At present, about 250 Nordic projects are being carried
out in some 40 firms and institutions.

The board as well as the technical groups are assisted by the secretariat which is responsible
for day to day activity. The secretariat is located at Esbo, Finland.

Financial framework
The cost of the Nordtest secretariat and a large proportion of project activity is financed
from the budget of the Nordic Council of Ministers. The grant for 1992 is approx. 2 million
ECU. The work of the board and the technical groups is financed by the participating
organisations.

Publications
- Register of 1300 test methods and technical reports
- Test methods
- Technical reports
NT TECHN REPORT 226
Approved 1994-02

Authors: Esa Vitikainen
NORDTEST project number: 1013-91-9
Institution: Technical Research Centre of Finland
Title (English): -

Title (Original): When do we need calibration of equipment used in testing laboratories?


Abstract:

One of the most important elements in a testing laboratory quality system is a practical,
well-functioning equipment management system. The validation of equipment with respect
to use can be carried out at various levels of uncertainty, depending mostly on the technical
capability of the equipment, the requirements set by the clients of the laboratory or by the
standard or test method used. Costs can be saved if no calibration is needed, or if calibration
can be performed in the simplest suitable way that produces an acceptable uncertainty level
for the measurement in question.

The purpose of this report is to give recommendations for testing laboratories with respect
to calibration, and to present tools and data for facilitating the planning of a practical
equipment assurance system.

A process for planning an equipment assurance programme is presented, and charts for
selecting measuring equipment and their calibration levels are given for the basic
quantities length, mass, force, time, temperature, voltage, resistance and current. The
uncertainties given in the charts are intended as guidance, not as accurate values. The
measuring ranges do not necessarily cover all the measuring needs of a testing laboratory.
However, they may also be useful in industrial measurements.

Technical Group: Quality Assurance


ISSN: 0283-7234 Language: English Pages: 47
Class (UDC): 620.1 Keywords: quality assurance, calibration, testing laboratories

Distributed by: Publication code:


NORDTEST
P.O.Box 116
FIN-02151 ESPOO
Finland
PREFACE

The project was financed by NORDTEST and managed by Esa Vitikainen (VTT,
the Technical Research Centre of Finland). Heikki Lehto, Tapio Manstén, Pekka
Immonen and Juha Sillanpää (VTT) contributed by supplying material and giving
their views for the report.

CONTENTS

ABSTRACT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1

PREFACE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2

CONTENTS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

1 INTRODUCTION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5

2 BACKGROUND MATERIAL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.1 NORDTEST REPORTS ON CALIBRATION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.2 EQUIPMENT ASSURANCE PROGRAMMES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.3 UNCERTAINTY BUDGET . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

3 CALIBRATION OF MEASURING EQUIPMENT FOR THE BASIC QUANTITIES . . . . . . . . . . . 13
3.1 LENGTH . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Checking of length-measuring equipment without any special calibration apparatus . 14
Calibration of equipment when an uncertainty of 0.2 mm is allowed and the
measuring range is under 10 m . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Calibration of equipment when an uncertainty of 0.01 mm is allowed and
the measuring range is under 100 mm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Calibration of equipment when uncertainties in the
range around 0.001 mm are required . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Calibration intervals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Factors affecting the uncertainty of length measurement . . . . . . . . . . . . . . . . . . . . . 17
3.2 MASS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
Weighing equipment classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Recommendations for weighing equipment calibration . . . . . . . . . . . . . . . . . . . . . . . 19
Factors affecting the uncertainty of weight measurement . . . . . . . . . . . . . . . . . . . . . . 21
3.3 FORCE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
Recommendations for force-measuring instrument calibration . . . . . . . . . . . . . . . . . 23
Factors affecting the uncertainty in force measurement . . . . . . . . . . . . . . . . . . . . . . . 23
3.4 TIME . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Uncertainty levels of different measuring instruments . . . . . . . . . . . . . . . . . . . . . . . . 25
Recommendations for calibration of instruments used in time
interval measurement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Factors affecting the uncertainty in time measurement . . . . . . . . . . . . . . . . . . . . . . . 26
3.5 TEMPERATURE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Recommendations for thermometer calibration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Factors affecting the uncertainty budget in temperature measurement . . . . . . . . . . . . 29
3.6 ELECTRICAL QUANTITIES; VOLTAGE, RESISTANCE AND
CURRENT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
Resistance, DC voltage and current . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

AC voltage and current . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33


Uncertainty levels of different measuring instruments . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Recommendations for instrument calibration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Factors affecting the uncertainty in electrical measurements . . . . . . . . . . . . . . . . . . . . . . . 36

4 SUMMARY AND SUGGESTIONS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37

REFERENCES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40

STANDARDS AND OTHER LITERATURE DEALING WITH EQUIPMENT ASSURANCE . . . . . . . . . 43

1 INTRODUCTION

One of the most important elements in a testing laboratory quality system is a prac-
tical, well functioning equipment management system. The validation of equip-
ment with respect to use can be carried out on various levels of uncertainty
depending mostly on the technical capability of the equipment, the uncertainty
requirements set by the clients of the laboratory or by the standard or test method
used. Costs can be saved if no calibration is needed, or if calibration can be
performed in the simplest suitable way that produces an acceptable uncertainty
level for the measurement in question.

Some general problems in testing organizations which are related to measurement
and calibration are listed in the following:

- The lack of understanding of the logical chain of measurement when defining
the accuracy requirements for the property to be measured, for the measuring
equipment and for the calibration.

- There is no clear understanding of which measurements are important to the
quality of the test results and which are not. This may lead to accuracy
requirements that are too high or too low, and further to unnecessarily high
calibration costs or to poor and unreliable results.

- In many cases one is only interested in the measuring equipment and its cali-
bration and uncertainties connected with them without taking into account the
other components of the measurement uncertainty (measuring process, environ-
ment, personnel, object to be measured etc.)

- Calibration is required for measuring equipment which does not need it, for
instance equipment used in indicative (direction-giving), relative or comparative
measurements, or in intermediate measurements which do not affect the quality
of the final results.

- Traceability chain and uncertainties of the calibration are not documented.

The purpose of this report is to give views and recommendations to testing labo-
ratories with respect to calibration, and to present tools and data for facilitating
the planning of a practical equipment assurance system.

2 BACKGROUND MATERIAL

In the following, some earlier Nordtest recommendations, as well as principles of
equipment assurance and uncertainty, are collected as background material.

2.1 NORDTEST REPORTS ON CALIBRATION

Calibration, its role and effects on the quality of test results are also discussed in
some earlier Nordtest Reports (Andersson 1991, Forstén 1991, Kjell et al. 1993).

Andersson (1991) recommended the following actions:

1. The improvement of personnel competence by the production of a booklet or
manual containing
- introductions to the properties of different types of measuring equipment,

- correct and recommended use of measuring equipment, and sources of errors,

- basic statistics and error analysis,

- references to standards and requirements.

2. Short guidelines for the planning and structure of the calibration programme
which could contain
- requirements for testing standards and quality systems,

- the technical performance of a test in view of the calibration needs,


- basis for choice of a calibration level and a calibration procedure,

- establishment of calibration programmes, laboratory standards, calibration intervals etc.,

- responsibilities, training,

- running the programme, auditing.

3. A manual of calibration methods, adjusted to the needs of a testing laboratory.

Forstén (1991), among others, arrived at the following recommendations with
respect to calibration:

1. Calibration of equipment is always done in testing as an integrated part of the
test procedures. Calibration should be a sub-item of testing and not an independent
infrastructure, as is now stressed by accreditation bodies. The same applies to
measuring techniques.

2. The mutual acceptance of test results should be based more and more on the
overall capability to perform the test correctly. The increased use of interlabo-
ratory test comparisons, round robin testing and proficiency testing must be
encouraged.

3. In interlaboratory test comparisons and proficiency testing, the results should
always be analyzed in more detail to identify uncertainty caused by different
factors. It is also important that studies are carried out in which certain parameters
in the testing are intentionally varied in order to establish their influence on the
final results.

Kjell et al. (1993) reported on the documentation of methods and procedures for
internal calibration. According to them, traceability is essential and must not be
compromised as such. Depending on the purpose of the calibration or check, the
degree of ambition can be modified. They say that interlaboratory test compari-
sons serve well to ensure coordinated measurement capabilities and these are often
useful but should not be regarded per se as warranting traceability. They also dis-
cuss factors influencing the choice of calibration or check frequency, documenta-
tion and requirements on personnel performing the task.

A simple method for uncertainty analysis, which is tailored specifically for inter-
nal calibration purposes is presented in their report.

2.2 EQUIPMENT ASSURANCE PROGRAMMES

If tests or measurements are to be reliable, it is essential that each item of testing
or measuring equipment used in the process is known to be operating correctly and
giving results which are reliable in terms of the uncertainty of the measurement
required. Therefore, an equipment validation system, which gives the laboratory
confidence in the results it generates and reports to its clients, is needed in every
testing laboratory. An effective equipment validation system will provide for
(NATA 1991):
(i) a systematic approach to calibration, so that the errors in instruments can be
identified;
(ii) or alternatively, verification of instruments to confirm their compliance with
the specification requirements;
(iii) a traceability which links the laboratory into the national and international
measurement system;
(iv) supported by a laboratory equipment assurance programme which monitors
the reliability of all instruments and items of equipment critical to the performance
of the test or measurement.

A laboratory equipment assurance programme can contain the following elements (NATA 1991):
* Commissioning
- initial calibration,
- commissioning checks,
- intercomparison with other instruments
* Operational Checks
- regular checks; frequency (daily, weekly, or before each use).
* Periodic Checks
- more extensive checks on accuracy and performance at longer intervals.
* Routine preventive maintenance
* Complete recalibration
* Collection and analysis of historical data for prediction.

Independent of the scope of the equipment assurance programme, careful documentation
of the methods and results is stressed in every item of the programme.
The instruments used in verifications shall be identified, their accuracies and
traceability shall be documented etc.

Fig. 1 shows a schematic presentation of the interactions between testing, measurement
and calibration, including their uncertainties. According to the scheme, a test
in the laboratory usually involves measurements of more than one basic quantity;
thus, the total uncertainty of a test also consists of more than one uncertainty budget.
Equipment accuracy is one of the factors contributing to the uncertainty of a
measurement. The choice of the measuring equipment and the appropriate calibration
should be based on the equipment accuracy requirement, which can be
assessed by subtracting the other uncertainty components from the respective
measurement uncertainty.
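When the components are expressed as standard uncertainties and are independent, this
"subtraction" can be read as subtraction in quadrature, in the spirit of the ISO/TAG (1992)
guide. Written out (the symbols are introduced here only for illustration, they are not
used elsewhere in the report):

    u_equipment <= sqrt( u_allowed^2 - SUM_i u_i^2 )

where u_allowed is the permitted uncertainty of the measurement and the u_i are the other
uncertainty components (environment, staff, method, object to be measured, software etc.).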
Fig. 1. A schematic presentation of interactions between testing, measurement and calibration including their uncertainties. [Flow chart not reproduced here: it links the testing requirements, purpose and use of the results, the testing principles and the allowed uncertainty to the equipment assurance (calibration) programme, the choice of equipment and optimum calibration, the optimization of the total uncertainty budget, and the factors affecting measurement and calibration uncertainty (environment, staff, method, object to be measured, sampling, calculations/software, physical constants, reference standards, calibration equipment).]

The process in planning an equipment assurance programme is listed below:

1. The assessment and analysis of the testing processes of the laboratory with
respect to the measurements needed and their effects on the test results.

2. Budgeting of the allowed total uncertainty of the different measurements as
related to each testing process.

3. For each measurement, the selection of measuring equipment which is able to
cover the required measuring range and is optimally in agreement with the permitted
measurement uncertainty. This requires analyzing the uncertainty budget of the
measurement process with the chosen equipment; thus all components of uncertainty
need to be defined. The accuracy (uncertainty) of the equipment according to the
budget defines the maximum uncertainty allowed in the calibration of the equipment.
A transfer ratio of 3:1 between measurement uncertainty and calibration uncertainty
is normally used (a short numerical illustration is given after this list).

4. The choice of calibration equipment depends on the permitted uncertainty of
calibration, which also includes the same type of influence factors as measure-
ment. The level of calibration uncertainty sets the requirements of the calibration
environment, the competence of the personnel, the accuracy of reference standards
etc. On this basis, the choice between in-house calibration and external (official)
calibration services can be made.

5. A definition of the equipment which needs regular calibration, as well as a
definition of their calibration places and intervals.

6. Description and documentation of the equipment assurance programme.
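As a rough numerical illustration of the transfer ratio mentioned in step 3 (the figures are
chosen here only as an example, they do not come from the report): if the uncertainty budget
allows the measuring equipment a contribution of 0.3 mm, the 3:1 rule implies a calibration
uncertainty of at most about 0.3 mm / 3 = 0.1 mm; with the stricter 4:1 ratio referred to in
Section 3.2, the limit would be 0.3 mm / 4 = 0.075 mm.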



2.3 UNCERTAINTY BUDGET

The uncertainty of a measurement is, according to a definition (ISO/TAG 1992),
a parameter, associated with the result of a measurement, which characterizes the
dispersion of the values that could reasonably be attributed to the measurand. The
parameter may be, for example, a standard deviation, or the width of a confidence
interval. The uncertainty of measurements includes, in general, many components.
Some of these components may be evaluated from the statistical distribution of the
results of a series of measurements, and can be characterized by experimental stan-
dard deviations. The other components, which can also be characterized by stan-
dard deviations, are evaluated from assumed probability distributions based on
experience or other information. It is understood that all components of uncertain-
ty contribute to the dispersion.

The components of uncertainty of a measurement can be classified, for example,
according to the following origins:
- measurement environment,
- accuracy of measurement equipment (including uncertainty of calibration),
- software used in measurements and calculations,
- measuring staff,
- measuring process,
- object to be measured,
- physical constants.

Sometimes sampling, although it does not relate directly to the measurement
uncertainty, is the dominating uncertainty factor in the testing or measurement
process.

Each significant component of uncertainty of a measurement should be evaluated
separately and should be taken into account in the calculation of the total uncertainty.
It is useful to carefully document the basis of the evaluations and all calculations.
These can be used in assessing the effects of the different uncertainty components
on the total uncertainty when optimizing the measuring process. EXCEL, LOTUS or
other spreadsheet tables, which are easy to use for optimization of the measurement
uncertainty, are recommended.
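The same kind of budget can be kept in any calculation tool. The following minimal Python
sketch (not from the report; the component names and values are invented for illustration)
combines independent standard-uncertainty components in quadrature, as in the ISO/TAG (1992)
guide, and shows the relative weight of each component:

    import math

    # Illustrative uncertainty budget for one measurement (values are invented).
    # Each entry is an independent standard uncertainty, all in the same unit (mm).
    budget = {
        "equipment (incl. calibration)": 0.05,
        "environment (temperature)":     0.03,
        "staff (reading/adjustment)":    0.02,
        "object to be measured":         0.04,
    }

    # Combined standard uncertainty: root sum of squares of the components.
    u_total = math.sqrt(sum(u**2 for u in budget.values()))

    print(f"combined standard uncertainty: {u_total:.3f} mm")
    for name, u in budget.items():
        # Relative contribution of each component to the combined variance.
        print(f"{name:32s} u = {u:.3f} mm  ({(u**2)/(u_total**2):5.1%} of variance)")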

It is important that the testing or measurement procedure is so detailed that the
same maximum uncertainty is never exceeded when the procedure is followed.
From this it follows that, if there is a need to change the uncertainty, changes
in the procedure are always required; conversely, if the procedure is changed,
this always affects the uncertainty.

3 CALIBRATION OF MEASURING EQUIPMENT FOR THE BASIC QUANTITIES

This chapter deals with the measurement of the basic quantities: length, mass,
force, time, temperature, voltage, resistance and current. For each quantity it gives
a set of tables/graphical presentations that contain the uncertainty levels which can
readily be reached in different calibration organizations, and the uncertainty levels
of different measuring instrument/equipment types. These tables are intended to
help define suitable measuring equipment and its calibration level. For each
quantity, some advice is also given on the calibration of the instruments.

In addition, an assessment of the factors with respect to their weight in the total
uncertainty is discussed. Since the uncertainty of a test may consist of several
measurements, and the uncertainty of a measurement always depends on the whole
measuring process, including measuring staff, objects to be measured, environment
etc., this report only identifies factors contributing to the total uncertainty. The
contribution to the total uncertainty of a test from the calibration of equipment is,
in many cases, so small that its optimization and detailed budgeting is of marginal
use.

3.1 LENGTH

Uncertainty levels of different calibration organizations performing length calibrations
are given in Fig. 2. The shadowed areas are approximate working ranges of the
different laboratory types. If, for example, there is a need to measure dimensions
of around 1 mm, and the permitted relative uncertainty of measurement is 10^-3,
calibration in an accredited calibration laboratory is recommended.

Typical attainable uncertainty levels for different instrument types/groups used in
length measurement are given in Fig. 3. The figure can be used in the selection of
a measuring instrument. If, for example, there is a need to measure in the range
around 1 mm and the permitted relative uncertainty of measurement is 10^-3, length
transducers or dial gauges can be used, and, on the basis of Fig. 2, it is advisable
that their calibration be carried out in an accredited calibration laboratory.

Checking of length-measuring equipment without any special calibration apparatus

Verification of the scale of some simple length-measuring instruments, like graduated
rules and tape measures, can be made by using a comparison method. Possible
references are, for example, standard paper sheets. The size of an A4 sheet is
210 x 297 mm, and because of the certified quality systems of the paper mills its
dimensions are accurate to better than 1 %; generally the tolerance range is much
smaller. Square-ruled papers are also standardized in many countries, so the square
dimensions can be used as the reference in checking rule or tape scales.
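As a rough worked example of that tolerance: 1 % of the 297 mm side of an A4 sheet is
about 3 mm and 1 % of the 210 mm side about 2 mm; the tolerances actually held by the
paper mills are, as noted above, considerably smaller.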
[Fig. 2 chart not reproduced here: relative uncertainty (10^-1 to 10^-8) versus measuring range (1 µm to 1000 km) for in-house calibration, accredited calibration laboratories and national standards laboratories.]
Fig. 2. Uncertainty levels of different organizations performing length calibration (CMA 1993a, b).

[Fig. 3 chart not reproduced here: relative uncertainty (10^-1 to 10^-8) versus measuring range (1 µm to 1000 km) for typical length-measuring instruments: scanning electron microscope, laser interferometer, gauge blocks, transducers/dial gauges, micrometers, vernier callipers, steel rules, tape measures, base lines measured with an interference comparator, base lines/invar wires, mekometer, laser geodimeter, GPS (military and navigation) and microwave/flight-time interferometry; quartz metre (absolute and relative) shown for reference.]

Fig. 3. Typical attainable uncertainty levels for different length-measuring instruments (CMA 1993b, etc.).

Calibration of equipment when an uncertainty of 0.2 mm is allowed and the measuring range is under 10 m

This is probably the most commonly used accuracy level and measuring range in
testing laboratories. Graduated rules, tape measures and sometimes vernier callipers
are used in the measurements. It is recommended that the in-house calibration of
these instruments be carried out by comparison with a reference steel rule (length
1 - 3 m), which is itself calibrated, for example, every second year in an accredited
calibration laboratory.

Calibration of equipment when an uncertainty of 0.01 mm is allowed and the measuring range is under 100 mm

Vernier callipers, micrometers, dial gauges and length indicators are usually used
at this accuracy level and in this measuring range. If the laboratory has only one or
a few pieces of each instrument type, it is generally recommended to calibrate these
in an accredited calibration laboratory. If there are plenty of these instruments and
they are frequently used, a special length-measuring apparatus for in-house
calibration is recommended. If the use of the instruments is less frequent, a pertinent
series of reference gauge blocks can be used in the calibration.

Calibration of equipment when uncertainties in the range around 0.001 mm are required

Measuring equipment with an accuracy of 1 µm or better requires very accurate
calibration apparatus (gauge blocks, laser interferometer), a very carefully controlled
environment and competent calibration personnel. If the laboratory is able to carry
out measurements of such small uncertainty, it usually also has competent personnel
for calibrations. The choice between in-house calibration and an accredited
calibration laboratory depends on the amount of equipment and its use.

Calibration intervals

Calibration intervals depend on the type and use of the equipment. For length-measuring
instruments the interval usually varies between 3 and 12 months.

Factors affecting the uncertainty of length measurement

Typical uncertainty sources for different instrument groups used in length meas-
urement are:

- measuring environment: temperature differs markedly from the room temperature, dust, vibrations etc.,
- calibration of measuring equipment,
- measuring equipment itself: instability, wear, defects,
- software used in measurements and calculations: programme errors,
- measuring staff: reading error, adjustment error, position error, parallax error
etc.,
- measuring process: wrong measuring tools, number and objects of measure-
ments, calculation errors,
- object to be measured: surface roughness, irregularity of form/shape, rigidity,
impurities on the surface, magnetism, temperature differs from 20 °C, instabi-
lity,
- physical constants: errors in constants used in possible calculations.

The weights of the different uncertainty components in the total measurement
uncertainty are process dependent, and should be evaluated individually for each
measuring process.

3.2 MASS

The OIML (International Organization of Legal Metrology) has defined and classified
the mass standards and test weights used in the calibration of weighing machines.
The series of mass standards are identified, in order of accuracy (beginning from the
most accurate), with the letter codes E1, E2, F1, F2, M1 and M2. The relative
uncertainties of these standards as a function of measuring range are given as curves
in Fig. 4. The figure also gives an example of the capability range of a national
standards laboratory and a field calibration laboratory. In the calibration of weights
or mass standards, the reference shall be at least one accuracy class higher than the
weight to be calibrated. Verification officers, for example, use the F1 series for M1
weights and the F2 series for M2 weights.

Fig. 4. Uncertainty levels of different calibration bodies and accuracies of mass standards/standard weights (CMA 1993b, OIML recommendations). [Chart not reproduced here: relative uncertainty (1 to 10^-9) versus measuring range (1 mg to 100 t) for the OIML reference mass standard classes E1, E2, F1, F2, M1 and M2, for field calibration of weighing machines with 1000 to 10000 scale divisions, and for a national standards laboratory with the national 1 kg standard.]

Weighing equipment classification

Weighing equipment classes generally follow the designations indicated in Table 1
(Harris 1993). The class of the equipment depends on the scale interval (d), which
also defines the resolution, and the number of divisions on the scale. The typical
use of different equipment classes is also given in the table.
use of different equipment classes is also given in the table.

Table 1. Weighing equipment classes and their typical use (Harris 1993)

Recommendations for weighing equipment calibration

The accuracy and traceability of weighing equipment (balances, scales etc.), as
well as the associated uncertainty, are determined through the use of mass standards
or test weights (i.e. lower quality mass standards) in the verification process.
dards or test weights (i.e. lower quality mass standards) in the verification process.
The tolerances applied to mass standards must be proportionally smaller than the
tolerances of the devices that they are being used to verify. The error of the stan-
dard should generally be less than 1/3 of the tolerance of the device being tested
(3:1 transfer ratio).

In some cases, higher transfer ratios, like 4:1, are required. Where it is not technically
possible to ensure this transfer ratio of device accuracy to test weight tolerance,
it is recommended that the test weights be calibrated. The assigned values, rather
than the nominal values, should then be used to adjust the accuracy of the weighing
instrument. Table 2 gives some information on mass standard accuracy classes and
their typical use (Harris 1993); a small sketch of the transfer-ratio check is given
after the table.

Table 2. OIML mass standard and test weight accuracy classes and their typical
use (Harris 1993). The ASTM, NBS and NIST standard classifications are not
referred to in this report.
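A minimal sketch of the transfer-ratio check described above (Python is used here purely
for illustration; the tolerance values are invented, not taken from the OIML tables):

    def standard_is_adequate(device_tolerance: float,
                             standard_error_limit: float,
                             transfer_ratio: float = 3.0) -> bool:
        """Return True if the mass standard's maximum permissible error is small
        enough for verifying a device with the given tolerance (e.g. 3:1 rule)."""
        return standard_error_limit <= device_tolerance / transfer_ratio

    # Example: a balance verified to +/-0.3 g with a test weight whose maximum
    # permissible error is 0.05 g satisfies the 3:1 (and even the 4:1) rule.
    print(standard_is_adequate(0.3, 0.05))        # True (3:1)
    print(standard_is_adequate(0.3, 0.05, 4.0))   # True (4:1)
    # A 0.15 g test-weight error would not satisfy the 3:1 rule:
    print(standard_is_adequate(0.3, 0.15))        # False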

The verification procedure of a weighing instrument usually includes checking the
structure and proper functioning of the device, as well as checking the resolution,
repeatability, loading curve, zero point and the effects of angular loading. A one-year
verification period is recommended. Partial verifications, such as repeatability
checks and one-point checks, should be performed more frequently, and zero checks
before each weighing.

There is no official standard for the periodic verification of mass standards used in
non-commercial applications, but verification periods between one and five years
are in use. Changes to the verification period should be based on the calibration
history. The tolerances obtained in official verifications must also be acceptable for
non-commercial applications. Verification may be carried out either by direct
comparison with the standard weights, or by using a substitution method.
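A minimal sketch of the substitution method mentioned above (Python is used only for
illustration; the readings and values are invented): the standard weight and the unknown
weight are placed on the same balance in turn, and the unknown is assigned the value of
the standard plus the difference of the two readings, so that most balance errors cancel.

    def mass_by_substitution(m_standard: float,
                             reading_standard: float,
                             reading_unknown: float) -> float:
        """Substitution weighing: the balance is used only as a comparator,
        so its own calibration error largely cancels in the difference."""
        return m_standard + (reading_unknown - reading_standard)

    # Example with invented numbers: a 1 kg standard reads 999.97 g and the
    # unknown weight reads 1000.05 g on the same balance.
    print(mass_by_substitution(1000.000, 999.97, 1000.05))  # about 1000.08 g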

Factors affecting the uncertainty of weight measurement

Typical uncertainty sources for different instruments used in weight measurement
are listed in the following:

- measuring environment: humidity, temperature, impurities like dust, vibrations,
- calibration of measurement equipment: condition of mass standards,
- measuring equipment itself: instability, wear, defects, friction/impurities,
- software used in measurements and calculations: programme errors,
- measuring staff: reading error, adjustment error, loading error, parallax error
etc.,
- measuring process: wrong measuring tools, performance of the measurement,
- object to be measured: density/standard weight density - ratio, surface quality
(porosity), impurities on the surface or adsorbed in the pores, magnetism, sta-
bility/rigidity,
- physical constants: errors in constants used in possible calculations.

The weights of the different uncertainty components in the total measurement
uncertainty are process dependent and should be evaluated individually for each
measurement.

3.3 FORCE

The uncertainty levels of different force calibration laboratories/bodies and the
uncertainty levels of different measuring instruments are given in Fig. 5.

The measuring range of a force transducer is usually about one decade, and trans-
ducers can be constructed for almost any measuring range. Therefore, uncertain-
ties of individual transducers/transducer types are not given here.

Fig. 5. Uncertainty levels of different force calibration laboratories and uncertainty levels of different measuring instruments (CMA 1993a, b; EN 10002-2; prEN 10002-3). [Chart not reproduced here: relative uncertainty (1 to 10^-6) versus measuring range (1 N to 1 GN) for force-measuring systems of testing machines, force instruments used in the verification of force transducers (classes 2, 1, 0.5 and 0), accredited calibration laboratories and national standards laboratories.]

Recommendations for force-measuring instrument calibration

In a testing laboratory, force transducers are mostly used in different mechanical
testing machines (for example tensile, compressive, bending, fatigue and creep
testing machines). The European Standard EN 10 002-2 is usually followed or applied
in the verification of the force-measuring system of these machines. A standard for
the calibration and classification of the force instruments used in the verification of
testing machines, prEN 10 002-3, is also in preparation. For small forces (up to about
500 N), known masses are also used in verification.
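As a reminder of the relation used when known masses serve as the force reference (standard
physics, not specific to this report): a mass m in the local gravitational field g produces the
force F = m g, so a 10 kg weight gives roughly 10 kg x 9.81 m/s^2, i.e. about 98 N; for the best
accuracy the local value of g and the air buoyancy correction have to be taken into account.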

The use of the services of an accredited calibration laboratory is recommended in
the calibration of force-measuring instruments. The recommended calibration
interval is one year (DIN recommends every second year).

Periodical checks with known masses (one weight per force range) can be used in
the verification of the instruments between the calibrations.

Factors affecting the uncertainty in force measurement

The level of uncertainty depends on the equipment and procedures used in the
measurement and calibration, thus, the uncertainty of each measurement should be
evaluated individually.

In the following, the factors which affect the uncertainty in force measurements are
given.

Uncertainty of the reference standard (measuring normal)
- calibration uncertainty,
- bridge uncertainty,
- creep.
Uncertainty of calibration method
- measuring equipment,
- dynamic measurement,
- interpolation.
Environment
- temperature (important if the equipment is not temperature compensated),
- temperature gradients.
Uncertainty due to machine to be calibrated
- resolution/reading accuracy,
- line of force application (non axial),
- friction in deflection measuring systems,
- possible periodical errors.

3.4 TIME

The quantity time covers both the instant of time and the time interval; both use the
second as their unit. The instant of time is given by Coordinated Universal Time (UTC),
which is derived from a large group of atomic clocks and maintained by the BIPM, Paris.
The best accuracy of synchronization to UTC at a National Standards Laboratory is a few
hundred nanoseconds. Time comparisons between National Standards Laboratories are
nowadays made mainly via GPS (Global Positioning System) satellites. Countries use
different time zones as their local time for practical reasons. Usually local time is
distributed by radio time signals. The uncertainty of a radio time signal is at best
about ± 0.1 ms.

Absolute time is not a very common object of calibration. Usually, only the
National Standards Laboratories maintain absolute time. Many measurements that
seem to need accurate time need, in fact, only clock synchronization at the begin-
ning. Time interval measurements are common and normal frequency counters
usually measure time intervals as well. Time interval calibrations are thus an
important part of equipment calibrations.

Uncertainty levels of different calibration laboratories performing calibration of
time interval-measuring equipment are given in Fig. 6.

Fig. 6. Uncertainty levels of different calibration laboratories performing time interval calibration and uncertainties of different time-measuring equipment. [Chart not reproduced here: relative uncertainty dT/T (1 to 10^-12) versus time interval T (10^-9 to 10^8 s) for digital clocks and stop watches, electrically controlled digital clocks and stop watches, and national standards/accredited calibration laboratories.]

Uncertainty levels of different measuring instruments

Many time interval measurements are performed by using electronic or mechanical
stopwatches. Because these are hand controlled, for short periods of time the accuracy
depends mainly on the person operating them (reaction time 0.2 - 1 s). Only if the
measurement takes several hours, and the accuracy requirement is high, does the basic
clock accuracy of 10^-4 - 10^-6 become significant. The accuracy of commercially
available modern stopwatches meets most of the requirements of conventional testing
laboratories.

When more accurate time intervals are needed, electronic means of starting and
stopping the time interval counts are used. If the starting and stopping signals are
sharp pulse edges, this can easily be done within a few nanoseconds, even picoseconds.
In this case, the frequency error of the main clock of the measuring equipment starts
to dominate, even when measuring short intervals. If, for instance, a counter with an
uncompensated crystal oscillator is used for measuring a 1 s interval, the error from
triggering may be about 2 x 10^-9, but the error from the main clock is about
10^-5 - 10^-6. If more accuracy is needed, a measuring device with an oven-controlled
crystal oscillator, or locking of the main clock to an accurate external source, is
recommended.

Recommendations for calibration of instruments used in time interval measurement

Calibration of clocks, watches or other less accurate time-measuring instruments
can easily be done by comparing the time intervals given by them to the time interval
between two appropriate time signals given by the radio, and documenting the
results of the comparison.

The basic accuracy of a quartz watch is better than 1 min/24 h, i.e. a relative error
below about 10^-3, and typical quartz movements drift by only seconds per day or less.
Therefore, if the accuracy requirement is less demanding than this (a permitted relative
uncertainty greater than about 10^-4) and the watch is known to be reliable, it can be
used in time interval measurements without calibration.
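As a rough check of the orders of magnitude quoted above: a drift of 1 min over 24 h
corresponds to 60 s / 86 400 s, i.e. about 7 x 10^-4, and a drift of 1 s over 24 h to about
1.2 x 10^-5.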

Precise time interval counters should be calibrated at a calibration laboratory.
Functional checks may be made by measuring a known time interval (for instance,
a power network period is 20 ms, a TV line is 64 µs) or measuring arbitrary inter-
vals together with a calibrated device.

Factors affecting the uncertainty in time measurement

The largest contributors to the uncertainties in time and time interval measurements
are often the links between the process to be measured and the measuring equipment
(delays or other errors in starting and/or stopping of the measuring instrument,
especially if human operators are involved) and the signal-to-noise level of the
start/stop signals.

3.5 TEMPERATURE

Uncertainty levels and utilization ranges of different thermometer types are given
in Fig. 7. If, for example, the required uncertainty of measurement is 0.1 °C, one
can use in the measurement either an accurate liquid-in-glass thermometer, an
accurate thermocouple, or a resistance thermometer, but in the calibration of those
instruments only resistance thermometers or fixed point calibration can be used.

Fig. 7. Uncertainty levels (U_L, U_T, U_R, U_P), calibration uncertainties (U_LC, U_TC, U_RC) and utilization ranges of different thermometer types (CMA 1993b, ASTM STP 470B, etc.). [Chart not reproduced here: allowed uncertainty (0.0001 to 100 °C) versus measuring range (-200 to 4000 °C, 50 to 4000 K) for liquid-in-glass thermometers, thermocouples, resistance thermometers and radiation thermometers/pyrometers, together with the ranges of in-house calibration, accredited calibration laboratories, national standards laboratories and fixed-point calibration (triple point of water and the Ar, Hg, Ga, Sn, Zn, Al, Ag and Cu fixed points).]

The selection of thermometers/thermocouples available is large, and a special type
exists for almost every application. The most widely used types are standardized
(DIN, ASTM etc.). The environment (aggressiveness), accessibility, temperature
range and the uncertainty allowed in the measurement are the most important
parameters in selecting the thermometer.

The uncertainty of calibration depends primarily on the quality of the equipment
available. No clear distinction between the uncertainty levels of external (official) cal-
ibration laboratories and in-house calibration can be made.

Recommendations for thermometer calibration

The comparison method is recommended for the in-house calibration of thermometers.
A traceably calibrated thermometer and the thermometer/thermocouple
to be calibrated are immersed in a suitable, stirred liquid bath (organic liquid,
water, oil, salt, liquid tin), or put in a suitable laboratory furnace, and the readings
of the thermometers are compared.
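A minimal sketch of how the comparison readings might be reduced to corrections (Python
is used only for illustration; the readings are invented):

    # Comparison calibration: the reference thermometer and the device under
    # calibration read the same stirred bath; the correction is reference
    # reading minus device reading at each calibration point.
    readings = [
        # (reference reading in degC, device reading in degC)
        (20.00, 20.12),
        (50.00, 50.08),
        (80.00, 79.95),
    ]

    for t_ref, t_dev in readings:
        correction = t_ref - t_dev
        print(f"at {t_dev:6.2f} degC indicated: correction {correction:+.2f} degC")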

The range of thermometers used in laboratories, and of their accuracies and
calibration accuracies, is so wide that it is impossible to give in this report any
practical general procedure for their calibration.

The choice of calibration temperatures should cover the range which is used in the
temperature measurement. It is important that the sensing head of the thermometer
being calibrated and the sensing element of the reference thermometer are at the
same temperature. Metal blocks with drilled holes for the thermocouples, or other
sensing elements, can be used in furnaces to homogenize the temperature during
calibration.

If two thermocouples are compared in calibration, it is important to use a stable
reference junction temperature (cold point).

Pyrometers can be calibrated by measuring the temperature of an appropriate black
body which is equipped with a reference thermometer and by comparing the readings.

The calibration interval of liquid-in-glass thermometers generally varies between
24 and 36 months. For resistance element or thermocouple-based equipment, cal-
ibration intervals from 6 to 24 months are used.

Factors affecting the uncertainty budget in temperature measurement

The level of uncertainty depends on the equipment and procedures used in the
measurement and calibration, thus, uncertainty of each measurement should be
evaluated individually.

In the following, factors are given which affect the uncertainty in temperature
measurements.

Calibration of measurement equipment

- uncertainty of the reference standard (normal),
- homogeneity and stability of the temperature in the calibration furnaces or
baths,
- differences in time constants between the reference and equipment to be cali-
brated.

Liquid-in-glass thermometers
- reading error, parallax error,
- change rate of the temperature,
- immersion depth,
- hysteresis/elasticity of the glass (temperatures over 100 °C).

Resistance thermometers (Pt-100, Ni-1000 etc.)


- self-heating,
- nonlinearity,
- deviations in nominal resistance,
- stability /ageing.

Thermocouples
- homogeneity and purity of the wires,
- cold-point stability and accuracy,
- stability,
- insulation resistance and insulation defects in the sheath (mantle),
- compensation cables, measuring cables, connections,
- immersion depth/axial temperature gradients,
- uncertainty in potential measurements,
- electro-magnetic disturbances.

Surface thermometers
- surface properties; roughness, conductivity,
- thermal mass of the thermometer.

Uncertainty of bridge and indicating instruments


- resolution,
- measuring method (AC, DC),
- frequency,
- linearity,
- stability,
- environment temperature,
- stability of the supply network (mains),
- noise.

3.6 ELECTRICAL QUANTITIES: VOLTAGE, RESISTANCE AND CURRENT

Resistance, DC voltage and current

The national standards for direct voltage and resistance are based directly on the
Josephson and quantum Hall effects, or indirectly via international comparisons.
Standards of the best accuracy are maintained for the values 1 or 10 volts and 1,
100 or 12906 ohms. The reference standards of calibration laboratories are usually
solid-state voltage standards, stable wire-wound resistors and/or multifunction
calibrators and digital multimeters.

DC current is normally traced to voltage and resistance by maintaining shunt resistors
of known values. If accuracy requirements are not very high, calibration laborato-
ries can often rely on the current ranges of stable multimeters and calibrators. High
current values (≥ 10 A) normally require the use of shunt resistors in all measure-
ments.
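As a small worked example of this traceability route (the numbers are chosen here only for
illustration): a current of 10 A passed through a calibrated 10 mohm shunt produces a voltage
drop of U = R I = 0.1 V; the current is then obtained as I = U/R, and for independent
contributions its relative uncertainty is approximately sqrt((dU/U)^2 + (dR/R)^2), so a
5 x 10^-5 voltage measurement combined with a 5 x 10^-5 shunt calibration gives about
7 x 10^-5 for the current.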

The uncertainty levels of different calibration organizations performing voltage,
resistance and DC current measuring equipment calibration are given in Figs. 8, 9
and 10, respectively.

Fig. 8. Uncertainty levels of different calibration organizations performing DC voltage measuring equipment calibration. [Chart not reproduced here: best measurement capability, relative uncertainty dU/U (10^-1 to 10^-8), versus voltage (0.001 to 10000 V) for national standards laboratories (Josephson junction) and accredited calibration laboratories.]

Fig. 9. Uncertainty levels (best measurement capabilities) of different calibration organizations performing resistance-measuring equipment calibration. [Chart not reproduced here: relative uncertainty dR/R (10^-1 to 10^-9) versus resistance (10^-4 to 10^10 ohms) for national standards laboratories (quantum Hall effect) and accredited calibration laboratories.]

Fig. 10. Uncertainty levels (best measurement capabilities) of different calibration organizations performing direct current measuring equipment calibration. [Chart not reproduced here: relative uncertainty dI/I (10^-1 to 10^-6) versus current (10^-9 to 10^3 A) for national standards laboratories and accredited calibration laboratories.]

AC voltage and current

National Standards Laboratories maintain high-quality AC/DC transfer standards
for voltage and current. Also, some calibration laboratories achieve the smallest
uncertainty by comparison of AC voltage values to the corresponding DC values.
Very high values (> 1 kV and > 20 A) are often measured by using dividers and
transformers.

The national standards for the AC/DC ratio may have an accuracy of 10^-6, but an
uncertainty of 10^-4 is sufficient for most calibrations. The best measurement
capability of the calibration laboratories usually depends strongly on the voltage,
current and frequency values in question.

Uncertainty levels of different measuring instruments

The commonest instrument for measuring electrical quantities is the digital multimeter.
The specified measuring uncertainties vary from a few ppm (10^-6) to several per cent,
depending on the price, resolution and measuring range.

Figs. 11, 12 and 13 show classification examples where the instruments have been
divided into two price groups, 80 - 100 USD and 800 - 1000 USD. The uncertainty
ranges which can be reached by using instruments in these price groups are given.

The multimeters are commonly calibrated by using a multifunction calibrator. The
calibrator is an accurate voltage and current source. It usually contains a selection
of stable resistance standards. The accuracies of the most accurate multimeters and
calibrators do not notably differ. In some ranges, the accuracies may even be near
the measurement capability of the national standards laboratories.

Fig. 11. Typical uncertainty ranges attainable by different price-group voltage meters. [Chart not reproduced here: relative uncertainty dU/U (10^-1 to 10^-6) versus measured voltage (0.01 to 100 V) for multimeters in the USD 80 - 100 and USD 800 - 1000 price groups.]

Fig. 12. Typical uncertainty ranges attainable by different price-group resistance meters. [Chart not reproduced here: relative uncertainty dR/R (1 to 10^-5) versus measured resistance (10^-2 to 10^8 ohms) for multimeters in the USD 80 - 100 and USD 800 - 1000 price groups.]

[Fig. 13 chart not reproduced here: typical calibrated multimeter direct current performance, relative uncertainty dI/I (1 to 10^-6) versus measured current (10^-5 to 10 A) for multimeters in the USD 80 - 100 and USD 800 - 1000 price groups.]

Recommendations for instrument calibration

In calibrating a multimeter, one often wants to check the validity of the manufacturer's
specifications. The less accurate meters can be compared to a calibrated multimeter
with well-defined characteristics. However, for full verification of all ranges, the
services of a calibration laboratory are usually needed.

In a good quality system, the method of calibration and the calibration interval are
clearly described for each measuring instrument. A testing laboratory can send one
of its multimeters or a calibrator, e.g. once a year, to an accredited calibration
laboratory. This instrument can then be used as the reference standard to which all
other measurements are traced. To ensure traceability, the comparison results
between all measuring instruments and the reference must be documented.

Coarse functional checking of a multimeter can be done by measuring the known
voltage of a battery, power supply or AC network. Current ranges can be checked
by measuring the current of a known load, and resistance ranges by measuring, e.g.
metal film resistors with known tolerances. If these results are used as a calibra-
tion, the uncertainty is usually large (up to 10 %), but it may still be quite suffi-
cient for some purposes.

Factors affecting the uncertainty in electrical measurements

In the following, factors are given which affect the uncertainty in electrical meas-
urements.

Environment:

- The temperature dependence of the measuring instruments must be taken into
account if temperature variations are large or the measuring accuracy is high.
Temperature gradients also produce thermal voltages in cables, connectors,
switches etc.

- Leakage resistances can depend on air humidity. Also, the properties of some
electronic components may be affected by humidity.

- Electromagnetic interference can produce signals in the measuring circuit.
Sources of interference can be TV or radio stations, motors, transformers, elec-
tronic equipment, other measuring equipment etc.

- The stability of the supply network (mains) may affect the operation and the
internal temperature of the instruments.

Measuring instruments:
- Calibration uncertainty of the measuring instruments.

- Linearity of the instruments whenever the measured values differ from the cali-
bration points.

- Stability of the instruments.

- Resolution of the instruments.

4 SUMMARY AND SUGGESTIONS

The answer to the title question "When do we need calibration of equipment in a
testing laboratory?" is that calibration of measuring equipment is always needed
when the measurement uncertainty has a detrimental effect on the decisions made
on the basis of the test results. The uncertainty of calibration, however, can be
selected depending on the accuracy requirements.

In selecting measuring equipment and planning calibration, it is most important to
assess which measurements are important to the quality of the test results and which
are not. Even if certain ranges for an influence factor, such as temperature, are
defined in a standard, this does not necessarily mean that the factor has an effect on
the results of the test.

It is also important to define the measuring equipment which does not need calibration,
for instance measuring equipment which is used in indicative (direction-giving),
relative or comparative measurements, or in intermediate measurements which do
not affect the quality of the final results. Especially in research work, there are
usually several such measurements.

The logical chain of the measurements should be understood and taken into
account when defining the accuracy requirements for the property to be measured,
for the measuring equipment used and for the calibration.

Uncertainties for different measurements should be given in testing/measuring
standards and procedures. Testing results will consequently contain the uncertain-
ty statement. It is important that the testing or measurement procedure is so
detailed that the same uncertainty always applies when that procedure is followed.

It is useful to document the basis of uncertainty evaluations and calculations.
These can be used in assessing the effects of the different uncertainty components
on total uncertainty when optimizing the measuring process.

The choice of measuring equipment and appropriate calibration should be based on
the equipment accuracy requirement, which can be assessed by "subtracting" the
other uncertainty components from the respective measurement uncertainty.

The contribution of the uncertainty related to the calibration of equipment to the
total uncertainty of a test, which comprises the measuring process, environment,
personnel, object to be measured etc., is in many cases so small that its optimization
and detailed budgeting is of no use.

The process in the planning of an equipment assurance programme could be as follows:

1. Assessment and analysis of the testing processes of the laboratory with respect
to the measurements needed and their effects on the test results.

2. For each testing process, budgeting of the permitted total uncertainty to the
different measurements.

3. For each measurement, a choice of such measuring equipment which is able
to cover the required measuring range and is optimally in agreement with the
permitted measurement uncertainty. This requires analyzing the uncertainty
budget of the measurement process with the chosen equipment, thus, the
components of uncertainty need to be defined. The accuracy (uncertainty) of
the equipment in the budget defines the maximum uncertainty permitted in the
calibration of the equipment.

4. The choice of calibration equipment depends on the permitted uncertainty of
calibration, which also contains the same error sources as the actual measurement.
The level of calibration uncertainty sets the requirements for the calibration
environment, competence of personnel, accuracy of reference standards etc.
On the basis of these, the choice between in-house calibration and accredited
calibration services can be made.

5. Definition of the equipment which needs regular calibration, as well as definition of the calibration places and intervals.

6. A description and documentation of the equipment assurance programme shall be made. Independent of the scope of the programme, careful documentation of the methods and results is important in every step of the programme. Instruments verified and instruments used in verifications shall be identified, and their accuracies and traceability shall be documented, etc.
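The budgeting described in steps 2 to 4 can be illustrated with a short calculation, for example in Python. The sketch below is illustrative only: the component names and numerical values are hypothetical, uncorrelated components combined in quadrature are assumed (as in the ISO Guide to the expression of uncertainty in measurement), and the 1:3 ratio between calibration and equipment uncertainty is a common rule of thumb, not a requirement of this report.

    import math

    # Hypothetical uncertainty budget for one measurement; all values are
    # standard uncertainties expressed in the same unit.
    permitted_total = 0.50            # permitted total measurement uncertainty (step 2)
    other_components = {              # components other than the measuring equipment
        "environment": 0.20,
        "personnel": 0.10,
        "object_measured": 0.25,
    }

    # Step 3: the uncertainty "left over" for the measuring equipment, obtained
    # by subtracting the other components in quadrature from the permitted total.
    sum_sq_others = sum(u ** 2 for u in other_components.values())
    u_equipment_max = math.sqrt(permitted_total ** 2 - sum_sq_others)

    # Step 4: the calibration uncertainty should be small compared with the
    # equipment uncertainty; here a 1:3 ratio is used as a rule of thumb.
    u_calibration_max = u_equipment_max / 3

    print(f"Maximum equipment uncertainty:   {u_equipment_max:.3f}")
    print(f"Maximum calibration uncertainty: {u_calibration_max:.3f}")

With the values above, the permitted equipment uncertainty is about 0.371 and the corresponding maximum calibration uncertainty about 0.124, which would then guide the choice between in-house and accredited calibration in step 4.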

The National Standards Laboratories, together with the accredited calibration laboratories, should be encouraged to gather data and experience on the accuracy properties of different measuring equipment as well as on calibration uncertainties, and to distribute the data to users in the form of uncertainty charts such as those given in Chapter 3 of this report.

REFERENCES
Andersson, H. 1991. An introductory study of calibration in testing laboratories.
Espoo: Nordtest. 30 p. (NT Techn Report 147).

ASTM. 1981. Manual on the use of thermocouples in temperature measurement. ASTM STP 470B. American Society for Testing and Materials. 258 p.

BIPM. 1992. Annual Report of the BIPM Time Section, Vol. 5, Paris.

Centre for Metrology and Accreditation (CMA). 1993a. Directory of Accredited Calibration Laboratories 1993. Helsinki: Centre for Metrology and Accreditation.

Centre for Metrology and Accreditation (CMA). 1993b. Directory of Finnish National Standard Laboratories 1992. Helsinki: Centre for Metrology and Accreditation. 74 p. (Publication J4/1993).

EN 10 002-2. 1991. Metallic materials - Part 2: Verification of the force measuring system of the tensile testing machine. European Committee for Standardization. 13 p.

prEN 10 002-3. 1992. Metallic materials - Part 3: Calibration of force proving instruments used in uniaxial testing machines. European Committee for Standardization. 17 p. + app. 7 p.

Forstén, J. 1991. A view on the assessment of the technical competence of testing laboratories. Espoo: Nordtest. 46 p. (NT Techn Report 149).

Harris, G. 1993. Ensuring Accuracy and Traceability of Weighing Instruments. ASTM Standardization News, April, p. 44-51.

ISO/TAG. 1992. Guide to the expression of uncertainty in measurement. ISO/TAG 4/WG 3: June 1992. First Edition. ISO/IEC/OIML/BIPM.

Kjell, G., Larsson, P.-O., Larsson, E., Svensson, T. & Torstensson, H. 1993.
Guidelines for in-house calibration. Borås, Sweden: Swedish National Testing
and Research Institute, Materials and Mechanics. 37 p. (Nordtest Project No.
1013-91-6).

NATA. 1991. Quality management in the laboratory. Topic 10, Calibration and
equipment management. LABMAN10.FIN, Issue 1, August 1991. National
Association of Testing Authorities, Australia.

OIML. 1972. International recommendation No. 1 (including amendments), Cylindrical weights from 1 gram to 10 kilograms (Medium accuracy class, also called: class M2). International Organization of Legal Metrology. (Official English translation by Technology Reports Centre, Department of Industry, Orpington, 1976).

OIML. 1972. International recommendation No. 2 (including amendments), Rectangular bar weights from 5 kilograms to 50 kilograms (Medium accuracy class, also called: class M2). International Organization of Legal Metrology. (Official English translation by Technology Reports Centre, Department of Industry, Orpington, 1977).

OIML. 1973. International recommendation No. 20. Weights of Accuracy Classes E1 E2 F1 F2 M1 from 50 kg to 1 mg. International Organization of Legal Metrology. (Official English translation by Technology Reports Centre, Department of Industry, Orpington, 1975).

OIML. 1973. International recommendation No. 25. Standard weights for verifi-
cation officers. International Organization of Legal Metrology.

OIML. 1979. International recommendation No. 47. Standard weights for testing
of high capacity weighing machines. International Organization of Legal
Metrology (English version).

STANDARDS AND OTHER LITERATURE DEALING WITH EQUIPMENT ASSURANCE

Aaltio, M. & Sillanpää, J. 1992. Calibration system for testing laboratory. Espoo: Technical Research Centre of Finland. 44 p. + app. 23 p. (VTT Research Notes 1401, in Finnish).

ASTM. 1988. ASTM Standards on Precision and Bias for Various Applications,
3rd Edition. Philadelphia: American Society for Testing and Materials. 487 p.

ASTM. 1990. Manual on Presentation of Data and Control Chart Analysis, 6th
Edition. ASTM Manual Series: MNL 7. American Society for Testing and
Materials. 106 p.

ASTM. 1993. Annual Book of ASTM Standards, Section 14: General Methods and
Instrumentation. Volume 14.03: Temperature Measurement. American Society for
Testing and Materials. 502 p.

ASTM. 1993. Annual Book of ASTM Standards, Section 3: Metals Test Methods
and Analytical Procedures. Volume 03.01: Metals - Mechanical testing; Elevated
and Low-Temperature Tests; Metallography. American Society for Testing and
Materials. 550 p.

ASTM. 1993. Annual Book of ASTM Standards, Section 3: Metals Test Methods
and Analytical Procedures. Volume 03.03: Nondestructive Testing. American
Society for Testing and Materials. 808 p.

ASTM. 1993. Annual Book of ASTM Standards, Section 14: General Methods and
Instrumentation. Volume 14.02: General Test Methods, Non-metal; Laboratory
Apparatus; Statistical Methods; Forensic Sciences. American Society for Testing
and Materials. 1282 p.

AS 2415. 1980. Calibration system requirements. Standards Association of Australia. 8 p.

Bennich, P. 1992. Kalibreringsinstruktioner - Usikkerhedsbudgetter - EDB Måleudstyrsovervågning til ISO 9000 certificering (Calibration instructions - Uncertainty budgets - Computerized surveillance of measuring equipment for ISO 9000 certification), Udg. 1. Per Bennich Konsulent- og Ingeniørfirma, Værløse, September. 96 p. + app. (in Danish).

BS 5781. 1979. Specification for measurement and calibration systems. British Standards Institution. 5 p.

British Standards Institution. 1984. Manual of British Standards in Engineering Metrology. British Standards Institution. 233 p. (ISBN 0-09-151931-4).

Calibration of direct-reading contact thermometers (Draft test method), Nordtest project 951-90 "Calibration of thermometers for use in building services installation".

DIN. 1990. Längenprüftechnik 2. Lehren (Metrology 2. Standards for gauges), DIN-Taschenbuch 197 (Handbook 197), 3. Aufl. 312 p. (English translation).

DIN. 1991. Längenprüftechnik 1. Grundnormen, Messgeräte (Metrology 1. Basic standards and standards for measuring instruments), DIN-Taschenbuch 11 (Handbook 11), 9. Aufl. 400 p. (English translation).

DIN. 1990. Metallische Werkstoffe 1. Bescheinigungen, Prüfgeräte, Prüfmaschinen, erzeugnisformunabhängige mechanisch-technologische Prüfverfahren (Standards for the testing of metals 1. Certificates, testing apparatus and testing machines; testing of mechanical properties), DIN-Taschenbuch 19 (Handbook 19), 11. Aufl. 368 p. (English translation).

DIN. 1992. Metallische Werkstoffe 2. Elektrische und magnetische Eigenschaften von magnetischen Werkstoffen sowie von Elektroblech und -band, Metallographie, Wirkungen der Wärmebehandlung, zerstörungsfreie Prüfungen, Strahlenschutz (Standards for the testing of metals 2. Electrical and magnetic properties, metallography, effects of heat treatment, wear, non-destructive testing and radiation protection), DIN-Taschenbuch 56 (Handbook 56), 5. Aufl. 416 p. (English translation).

Handhavande och kalibrering av mätdon (Handling and calibration of measuring equipment). Sveriges verkstadsindustriers förlag. Uppsala 1992. 301 p. (in Swedish).

Häyrynen, J., Kallio, H. & Kosonen, A.-M. 1992. Reference materials in quality assurance of testing laboratories. Espoo: Nordtest. 35 p. (NT Techn Report 177).

ISO 10012-1. 1992. Quality assurance requirements for measuring equipment. Part
1. Management of measuring equipment. International Organization for
Standardization. 24 p.

ISO/CD 10012-2: Quality assurance requirements for measuring equipment. Part 2: Measurement process control. International Organization for Standardization (under preparation).

ISO/CD 12102.2: Measurement uncertainty. International Organization for Standardization (under preparation).

ISO Guide 31. 1981. Contents of certificates of reference materials, 1. Edition. International Organization for Standardization. 8 p.

ISO Guide 33. 1989. Uses of certified reference materials, 1. Edition. International
Organization for Standardization. 12 p.

ISO Guide 35. 1989. Certification of reference materials - General and statistical
principles, 2. Edition. International Organization for Standardization. 32 p.

ISO Guide 25. 1990. General requirements for the competence of calibration and testing laboratories. Third Edition. International Organization for Standardization. 7 p.

ISO Guide 30. 1992. Terms and definitions used in connection with reference materials, 2. Edition. International Organization for Standardization.

ISO/IEC Guide 58. 1993. Calibration and testing laboratory accreditation systems - General requirements for operation and recognition, 1. Edition. International Organization for Standardization. 6 p.

ISO Handbook. 1988. Applied metrology - Limits, fits and surface properties.
International Organization for Standardization. 846 p. (ISBN 92-67-10146-3).

ISO Handbook. 1988. Mechanical testing of metallic materials. International Organization for Standardization. 436 p. (ISBN 92-67-10142-0).

MIL-STD-45662A. Calibration systems requirements. Department of Defense, USA, 1988. 5 p.

Morris, A. S. 1991. Measurement and Calibration for Quality Assurance. American Society for Quality Control. 336 p. (ISBN 0-13-567652-5).

NAMAS. 1989. M 10. NAMAS Accreditation Standard. General Criteria of Competence for Calibration and Testing Laboratories. 24 p.

NAMAS. 1989. M 11. NAMAS Regulations. Regulations to be met by Calibration and Testing Laboratories. Teddington: NAMAS. 8 p.

NIS 7. 1984. NATLAS Information Sheet. Traceability. Thermometers, thermocouples and PRTs, NATLAS Executive. Teddington: National Physical Laboratory. 2 p.

Nordtest. 1987. Nordtest method NT MECH 009 Pressure Balances: Calibration. Espoo: Nordtest. 7 p.

Nordtest. 1989. Nordtest method NT MECH 022 Pressure Gauges: Calibration. Espoo: Nordtest. 8 p.

Nordtest. 1989. Nordtest method NT MECH 023 Pressure Balances: Gas medium,
calibration. Espoo: Nordtest. 3 p.

Ots, H. 1988. Fastläggande av riktlinjer för en undersökning av hur provningslaboratoriets behov av spårbara kalibreringar kan tillgodoses (Establishment of guidelines for an investigation of how the testing laboratory's need for traceable calibrations can be met). Borås, Sweden. 10 p. + app. 7 p. (Nordtest-projekt nr 736-87-1, MPR-M 1988:3; in Swedish).

Pendrill, L. 1992. Traceability, calibration and the uncertainty estimation in measurements for testing. Proc. 1st Eurolab Symposium, Eurolab. P. 199-208.

Penella, C. R. 1992. Managing the Metrology System. American Society for Quality Control. 95 p. (ISBN 0-87389-181-3).

Statistical Process Control Manual. American Society for Quality Control, Automotive Division, 1986. 50 p. (ISBN 0-87389-024-8).

Wheeler, D. J. & Lyday, R. W. 1989. Evaluating the measurement process. Knoxville: SPC Press Inc. 106 p.
TECHNICAL REPORTS FROM THE TECHNICAL GROUP FOR QUALITY ASSURANCE

145 Lindskov Hansen, S., Guidelines for the development of software to be used in test and measuring
laboratories. Espoo 1991. Nordtest, NT Techn Report 145. 46 p.

146 Ohlon, R., About procedures for internal quality audits of testing laboratories. Espoo 1991. Nordtest,
NT Techn Report 146. 39 p.

147 Andersson, H., An introductory study of calibration in testing laboratories. Espoo 1991. Nordtest,
NT Techn Report 147. 30 p.

148 Änko, S. & Sillanpää, J., Requirements on the personnel and organization of testing laboratories.
Espoo 1991. Nordtest, NT Techn Report 148. 70 p. (in Swedish)

149 Forstén, J., A view on the assessment of the technical competence of testing laboratories. Espoo
1991. Nordtest, NT Techn Report 149. 46 p.

164 Sillanpää, J., Requirements of ISO 9002 for testing and inspection. Espoo 1992. Nordtest, NT Techn
Report 164. 42 p.

165 Baade, S., Interpretation of EN 45 001 for fire testing laboratories. Espoo 1992. Nordtest, NT Techn
Report 165. 25 p.

177 Häyrynen, J., Kallio, H. & Kosonen, A.-M., Reference materials in quality assurance of testing
laboratories. Espoo 1992. Nordtest, NT Techn Report 177. 35 p.

178 Änko, S., Qualification system for the personnel of testing laboratories. Espoo 1992. Nordtest,
NT Techn Report 178. 17 p.

179 Ohlon, R., Comparison of standards with requirements on calibration and testing laboratories. Espoo
1992. Nordtest, NT Techn Report 179. 71 p.

187 Dybkær, R., Jordal, R., Jørgensen, P.J., Hansson, P., Hjelm, M., Kaihola, H.-L., Kallner, A., Rustad, P.,
Uldall, A. & de Verdier, C.-H., A quality manual for the clinical laboratory including the elements of a
quality system - Proposed guidelines. Espoo 1992. Nordtest, NT Techn Report 187. 30 p.

190 Piepponen, S., Laboratory quality systems: Actions in cases of complaint and disturbances. Espoo
1992. Nordtest, NT Techn Report 190. 17 p.

191 The development of quality assurance in testing laboratories. Seminar in Copenhagen 4-5 November
1992. Espoo 1992. Nordtest, NT Techn Report 191. 132 p.

196 Änko, S., Job descriptions in testing laboratories. Espoo 1993. Nordtest, NT Techn Report 196. 15 p.

197 Törrönen, K., Sillanpää, J. & Häyrynen, J., Integration of quality assurance into project and laboratory
management. Espoo 1993. Nordtest, NT Techn Report 197. 31 p.

216 Salmi, T., Methods for testing laboratories to evaluate customer satisfaction and enhance the service
quality. Espoo 1993. Nordtest, NT Techn Report 216. 31 p.

217 Kjell, G., Larsson, P-O., Larsson, E., Svensson, T. & Torstensson, H., Guidelines for in-house
calibration. Espoo 1993. Nordtest, NT Techn Report 217. 46 p.

224 Nilsson, U., QA training material for laboratory employees. Nordtest, NT Techn Report 224. (In print)

225 Giæver, H., The future role of testing laboratories, product certification bodies and inspection bodies. Nordtest, NT Techn Report 225. (In print)

226 Vitikainen, E., When do we need calibration of equipment used in testing laboratories? Espoo 1994. Nordtest, NT Techn Report 226. 47 p.

227 Lindroos, V., Recommendation for general terms and model forms of contracts for testing laboratories. Espoo 1994. Nordtest, NT Techn Report 227. 31 p.
