Ultimate Calibration
Beamex is a technology and service company that develops, manufactures and
markets high-quality calibration equipment, software, systems and services for the
calibration and maintenance of process instruments. The company is a leading
worldwide provider of integrated calibration solutions that meet even the most demanding
requirements. Beamex offers a comprehensive range of products and services, from portable
calibrators to workstations, calibration accessories, calibration software, industry-specific
solutions and professional services. Through Beamex’s global and competent partner network,
their products and services are available in more than 60 countries. As proof of Beamex's success, more than 10,000 companies worldwide use its calibration solutions. Several companies have been Beamex customers since the company was established over 30 years ago. For more information about Beamex and its products and services,
visit www.beamex.com
Copyright © 2009 by Beamex Oy Ab. All rights reserved.
No part of this publication may be reproduced or distributed in
any form or by any means, or stored in a database or retrieval
system, without the prior written permission of Beamex Oy Ab.
Requests should be directed to info@beamex.com.
Contents

Acknowledgements
Preface by the CEO of Beamex Group

PART 1
Quality standards and industry regulations
A basic quality calibration program

PART 2
Traceable and efficient calibrations in the process industry
How often should instruments be calibrated?
How often should calibrators be calibrated?
Automated calibration planning lowers costs
Benefits of integrating calibration software with CMMS
Fieldbus transmitters must also be calibrated
Calibration of weighing instruments, Part 1
Calibration of weighing instruments, Part 2
Calibration in hazardous environments
Improving power plant performance through calibration
Calibration in the pharmaceutical industry
The benefits of using a documenting calibrator
The safest way to calibrate
Why use software for calibration management?
Acknowledgements
This book is the result of work that has taken place between 2006 and 2009. A team of industry and calibration experts worldwide has worked on its creation by writing articles related to calibration.
On behalf of Beamex, I would
like to thank all of the people who
have contributed to the creation
of this book:
Jay Bucher, Mike Farkas, Heikki
Laurila, Osmo Luotsinen, Pertti
Mäki, Dean Palmer, Aimo Pusa,
David Spitzer and Vance VanDoren.
Additionally, I would like to
thank the team at Beamex, which
has made a huge contribution
to developing the concept and
planning the content of the book,
overseeing the graphic design,
providing technical know-how for
the writers, proof-reading the texts
as well as providing photographs
to illustrate the book. Without the
support of these people, this book
could not have been published.
Thank you!
Villy Lindfelt
Marketing Director
Beamex Group
Preface by the CEO of Beamex Group
Calibrators, calibration software and other related 'tools' have developed significantly during the past few decades, even though the calibration of measurement devices as such has existed for several thousand years.
Presently, the primary challenges of industrial metrology and
calibration include how to simplify and streamline the entire
calibration process, how to eliminate double work, how to reduce
production down-time, and how to lower the risk of human errors.
These all can be tackled by improving the level of system integration
and automation.
Calibration and calibrators can no longer be considered isolated,
stand-alone devices, systems or work processes within a company or
a production plant. Just like any other business function, calibration
procedures need to be automated to a higher degree and integrated to
achieve improvements in quality and efficiency. This is the area where
Beamex as a company aims to be the benchmark in the industry.
This book, Ultimate Calibration, has two main purposes: firstly,
to give some general information about industrial calibration, and
secondly to give a deeper introduction to the ‘world of integrated
and automated calibration’ in the form of independent articles. The
content of the book has been selected and organized in such a way
that as many readers as possible are able to find something useful and
interesting in it.
This book has several professional contributors, as stated in the Acknowledgements, but here I want to express my special thanks to Villy, our Marketing Director, who came up with the original idea for the book and who directed the project. I hope this book will assist you
in learning new things and provide you with fresh ideas. Enjoy your
reading!
Raimo Ahola
CEO
Beamex Group
PART 1
Quality standards and industry regulations
Introduction
Before going into what the current standards and regulations actually state, here is a reminder from times past about measurement practices and how important they really are.
Immersion in water makes the straight seem bent; but reason, thus
confused by false appearance, is beautifully restored by measuring,
numbering and weighing; these drive vague notions of greater or less
or more or heavier right out of the minds of the surveyor, the computer,
and the clerk of the scales. Surely it is the better part of thought that
relies on measurement and calculation. (Plato, The Republic, 360 B.C.)
There shall be standard measures of wine, beer, and corn…
throughout the whole of our kingdom, and a standard width of dyed
russet and cloth; and there shall be standard weights also. (Clause 35,
Magna Carta, 1215)
When you can measure what you are speaking about and express
it in numbers, you know something about it; but when you cannot
express it in numbers, your knowledge is of a meager and unsatisfactory
kind. It may be the beginning of knowledge, but you have scarcely, in
your thoughts, advanced to the stage of science. (William Thomson,
1st Baron Kelvin, GCVO, OM, PC, PRS, 26 June 1824–17 December
1907; A.K.A. Lord Kelvin).1
One of the earliest records of precise measurement is from Egypt.
The Egyptians studied the science of geometry to assist them in the
construction of the Pyramids. It is believed that about 3000 years B.C.,
the Egyptian unit of length came into being.
The QSR
Please note that paragraphs (a), (b), (1) and (2) all have requirements for documenting everything. In the FDA environment, if there is no documentation for an action, then the action did not occur.
Documented evidence is critical to showing that something happened,
(a) The regulations in this part set forth the criteria under which
the agency considers electronic records, electronic signatures,
and handwritten signatures executed to electronic records to be
trustworthy, reliable, and generally equivalent to paper records
and handwritten signatures executed on paper.
GAMP
Good Automated Manufacturing Practice (GAMP) is a technical
sub-committee, known as a COP (Community Of Practice) of the
International Society for Pharmaceutical Engineering (ISPE). The goal
of the community is to promote the understanding of the regulation
and use of automated systems within the pharmaceutical industry.
The GAMP COP organizes discussion forums for its members. ISPE organizes GAMP-related training courses and educational seminars.
Several local GAMP COPs, such as GAMP Americas, GAMP Nordic,
GAMP DACH (Germany, Austria, Switzerland), GAMP Francophone,
GAMP Italiano and GAMP Japan bring the GAMP community closer
to its members, in collaboration with ISPE’s local Affiliates in these
regions.
GAMP publishes a series of Good Practice Guides for its members on several topics involved in drug manufacturing. The best known is The Good Automated Manufacturing Practice (GAMP) Guide for Validation of Automated Systems in Pharmaceutical Manufacture. The last major revision (GAMP 5) was released in January 2008. The Good Practice Guides include:
• GAMP Good Practice Guide: A Risk-Based Approach to Compliant Electronic Records and Signatures
• GAMP Good Practice Guide: Calibration Management
• GAMP Good Practice Guide: Electronic Data Archiving
• GAMP Good Practice Guide: Global Information Systems Control and Compliance
• GAMP Good Practice Guide: IT Infrastructure Control and Compliance
• GAMP Good Practice Guide: Testing of GxP Systems
• GAMP Good Practice Guide: Validation of Laboratory Computerized Systems
• GAMP Good Practice Guide: Validation of Process Control Systems
ISO 9001:2008
Basically, this is what is required according to ISO 9001:2008:
ISO 17025
ISO 17025 – General requirements for the competence of testing and calibration
laboratories. According to ISO 17025, this standard is applicable to all
organizations performing tests and/or calibrations. These include first-,
second-, and third-party laboratories, and laboratories where testing
and/or calibration forms part of inspection and product certifications.
Please keep in mind that if your calibration function and/or metrology department fall under the requirements of your company, whether for compliance with an ISO standard (ISO 9001:2000 or ISO 13485) or an FDA requirement (cGMP, QSR, etc.), then you have no obligation to meet the ISO 17025 standard. You already fall under a
quality system that takes care of your calibration requirements.
ANSI/NCSL Z540.3-2006
ANSI/NCSL Z540.3-2006 – American National Standard for Calibration – Requirements for the Calibration of Measuring and Test Equipment.
The objective of this National Standard is to establish the technical
requirements for the calibration of measuring and test equipment.
This is done through the use of a system of functional components.
Collectively, these components are used to manage and assure that
the accuracy and reliability of the measuring and test equipment are
in accordance with identified performance requirements.
In implementing its objective, this National Standard describes the
technical requirements for establishing and maintaining:
• the acceptability of the performance of measuring and test
equipment;
• the suitability of a calibration for its intended application;
• the compatibility of measurements with the National Measurement
System; and
• the traceability of measurement results to the International System
of Units (SI).
In the development of this National Standard attention has been
given to:
• expressing the technical requirements for a calibration system
supporting both government and industry needs;
• applying best practices and experience with related national,
international, industry, and government standards; and
• balancing the needs and interests of all stakeholders.
In addition, this National Standard includes and updates the relevant
calibration system requirements for measuring and test equipment
described by the previous standards, Part II of ANSI/NCSL Z540.1
(R2002) and Military Standard 45662A.
This National Standard is written for both Supplier and Customer, each term being interpreted in the broadest sense. The "Supplier" may be a producer, distributor, vendor, or a provider of a product, service, or information. The "Customer" may be a consumer, client, end-user, retailer, or purchaser that receives a product or service.
Reference to this National Standard may be made by:
• customers when specifying products (including services) required;
• suppliers when specifying products offered;
• legislative or regulatory bodies;
• agencies or organizations as a contractual condition for procurement;
and
• assessment organizations in the audit, certification, and other
evaluations of calibration systems and their components.
This National Standard is specific to calibration systems. A
calibration system operating in full compliance with this National
Standard promotes confidence and facilitates management of the risks
associated with measurements, tests, and calibrations.8
A basic quality calibration program
Why do we calibrate?
R&D departments are tasked with coming up with the answers to many problems; the cure for cancer is one of them. Let's
imagine that the Acme Biotech Co. has found the cure for
cancer. Its R&D section sends the formula to the operations and manufacturing division, but the cure cannot be replicated with consistent results, because the company does not use calibrated test instruments. Measurements made by R&D differ from those made by the operations section. If all test equipment were calibrated to a traceable standard, repeatable results would ensure that what is made in one part of the company can be reproduced in another. The company loses time, money, its reputation, and possibly the ability to stay in business, simply because it does not use calibrated test equipment. A fairy tale? Far from it. This scenario is repeated every day throughout the world.
Without calibration, or by using incorrect calibrations, all of us pay
more at the gas pump, for food weighed incorrectly at the checkout
counter, and for manufactured goods that do not meet their stated
specifications. Incorrect amounts of ingredients in your prescription
and over-the-counter (OTC) drugs can cost more, or even cause
illness or death. Because of poor or incorrect calibration, criminals
are either not convicted or are released on bad evidence. Crime labs
cannot identify the remains of victims or wrongly identify victims
in the case of mass graves. Airliners fly into mountaintops and off
the ends of runways because they don’t know their altitude and/or
speed. Babies are not correctly weighed at birth. The amount of drugs
confiscated in a raid determines whether the offense is a misdemeanor
or a felony; which weight is correct? As one can see, having the correct
measurements throughout any and all industries is critical to national
and international trade and commerce.
The bottom line is this: all test equipment that makes a quantitative measurement requires periodic calibration. It is as simple as that.
However, before we go any further, we need to clarify two definitions
that are critical to this subject – calibration and traceability.
By definition:
[Figure 1: Traceability hierarchy – BIPM, NMIs, reference standards.]
[Figure 2: Standards hierarchy – SI units, primary standards, secondary standards, reference standards, working standards.]
5 Elements of traceability
5.1 Traceability is characterised by a number of essential elements:
(a) an unbroken chain of comparisons going back to a standard
acceptable to the parties, usually a national or international
standard;
(b) measurement uncertainty; the measurement uncertainty
for each step in the traceability chain must be calculated
according to agreed methods and must be stated so that an
overall uncertainty for the whole chain may be calculated;
(c) documentation; each step in the chain must be performed
according to documented and generally acknowledged
procedures; the results must equally be documented;
(d) competence; the laboratories or bodies performing one or
more steps in the chain must supply evidence for their technical
competence, e.g. by demonstrating that they are accredited;
(e) reference to SI units; the chain of comparisons must end at
primary standards for the realization of the SI units;
(f) re-calibrations; calibrations must be repeated at appropriate
intervals; the length of these intervals will depend on a number
of variables, e.g. uncertainty required, frequency of use, way of
use, stability of the equipment.
5.2 In many fields, reference materials take the position of physical
reference standards. It is equally important that such reference
materials are traceable to relevant SI units. Certification of
reference materials is a method that is often used to demonstrate
traceability to SI units.1
The other document that goes hand-in-hand with this is EA-4/02, Expression of the Uncertainty of Measurement in Calibration.
The purpose of this document is to harmonise evaluation of
uncertainty of measurement within EA, to set up, in addition to the
general requirements of EAL-R1, the specific demands in reporting
uncertainty of measurement on calibration certificates issued by
accredited laboratories and to assist accreditation bodies with a coherent
assignment of best measurement capability to calibration laboratories
accredited by them. As the rules laid down in this document are in
compliance with the recommendations of the Guide to the Expression
of Uncertainty in Measurement, published by seven international
organisations concerned with standardisation and metrology, the
implementation of EA-4/02 will also foster the global acceptance of
European results of measurement.2
By understanding and following both of these documents, a
calibration function can easily maintain traceable calibrations for
the requirements demanded by their customers and the standard or
regulation that their company needs to meet.
To maintain traceability, without using uncertainty budgets or
calculations, you must ensure your standards are at least four times (4:1)
more accurate than the test equipment being calibrated. Where does
this ratio of four to one (4:1) come from? It comes from the American
National Standard for Calibration – (ANSI/NCSL Z540.3-2006)
which states: “Where calibrations provide for verification that measurement
quantities are within specified tolerances…Where it is not practical to estimate
this probability, the TUR shall be equal to or greater than 4:1.”
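The 4:1 test can be sketched with a few lines of arithmetic; the tolerance and accuracy figures below are hypothetical, not taken from any real specification:

```python
# Test Uncertainty Ratio (TUR) check: the reference standard should be
# at least four times as accurate as the unit under test (UUT).
# All figures are hypothetical, for illustration only.

def tur(uut_tolerance, standard_accuracy):
    """Ratio of the UUT's tolerance to the reference standard's accuracy."""
    return uut_tolerance / standard_accuracy

# A pressure gauge with a ±0.10 bar tolerance, checked against a
# calibrator accurate to ±0.02 bar:
ratio = tur(0.10, 0.02)
print(f"TUR = {ratio:.0f}:1 ->",
      "acceptable" if ratio >= 4 else "standard not accurate enough")
```

Against a ±0.05 bar standard, the same gauge would give only a 2:1 ratio and fail the criterion.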
So, if a TUR of equal to or greater than 4:1 is maintained, then
word for word. Of course not. But they must have their calibration procedure on hand each time they perform the calibration. If a change has been made to that procedure, the calibration technician must be trained on the change before they can perform the calibration, and the appropriate documentation must be completed to show that the training was accomplished and signed off. When the proper training is not documented and signed off by the trainer and trainee, it is the same as if the training never happened.
“Say what you do” means write in detail how to do your job. This
includes calibration procedures, work instructions and standard
operating procedures (SOPs).
“Do what you say” means follow the documented procedures or
instructions every time you calibrate, or perform a function that
follows specific written instructions.
“Record what you did” means that you must record the results of your measurements and adjustments, including what your standard(s) read or indicated both before and after any adjustments are made.
“Check the results” means make certain the test equipment meets
the tolerances, accuracies, or upper/lower limits specified in your
procedures or instructions.
“Act on the difference” means if the test equipment is out of tolerance,
you’re required to inform the user/owner of the equipment because
they may have to re-evaluate manufactured goods, change a process,
or recall a product.3
“Say what you do” means write in detail how to do your job. This
includes calibration procedures, work instructions and SOPs. All of
your calibration procedures should be formatted the same as other
SOPs within your company. Here is an example of common formatting
for SOPs:
1. Purpose
2. Scope
3. Responsibilities
4. Definitions
5. Procedure
6. Related Procedures
7. Forms and Records
8. Document History
After section 4, Definitions, you should have a table listing all of the instruments or systems to be calibrated by that procedure, along with their ranges and tolerances. After that you should have a list of the standards to be used to calibrate the items. This table should also include each standard's range and specifications. The actual calibration procedure then starts in section 5, Procedure.
Manufacturer’s manuals usually provide an alignment procedure
that can be used as a template for writing a calibration procedure. They
should show what standards accomplish the calibration of a specific
range and/or function. A complete calibration must be performed
prior to any adjustment or alignment. An alignment procedure and/
or preventive maintenance inspection (PMI) may be incorporated
into your SOP as long as it is separate from the actual calibration
procedure.
There are, generally speaking, two types of calibration procedures:
Generic: temperature gages and thermometers, pressure and vacuum
gages, pipettes, micrometers, power supplies and water baths.
Specific: spectrophotometers, thermal cyclers, and balances/scales.
Generic SOPs are written to show how to calibrate a large variety of
items in a general context.
Specific SOPs are written to show step-by-step procedures for each different type of test instrument within a group of items. The calibration form may be designed to follow the procedure's numbered steps, which removes any doubt for the calibration technician about which data goes into which data field.
“Do what you say” means follow the documented procedures or
instructions every time you calibrate, or perform a function that follows
There should be only one way to file your records, both hard copy
and eRecords – no matter which system you use, put it into your
written procedures.
There are many different ways to manage your calibration data since
there are a variety of ways to collect that data. Hard copy records collected
during the calibration of test instruments have been discussed in detail
already. But the collection of data by electronic means, or through the
use of calibration software, process controllers, etc., should also be
considered. Is the system validated and instrumentation qualified prior
to use? If you are using any type of computerized system, validation of
that software is mandatory. How is the data collected and stored? Is it in
its native format or dumped into a spreadsheet for analysis? All of these
need to be considered to allow for review, analysis, and/or compilation
into your forms, and eventual storage.
The use of computerized data collection brings with it not only increased productivity and savings in time and effort, but also new problems in how to collect, manage, review and store the data. The criticality of validating your software, data lines and storage systems when going entirely electronic with your calibration records and data management cannot be emphasized enough.
“Check the results” means make certain the test equipment meets
the tolerances, accuracies, or upper/lower limits specified in your
procedures or instructions.
There are various ways to do this. Calibration forms should have
the range and their tolerances listed for each piece of test equipment
being calibrated. In some instances it is apparent what the tolerances
will be for the items being calibrated. In other cases it is not quite so
apparent.
[Figure: Calibration workflow – Start → 'as found' test → save 'as found' results → save 'as left' results → End.]
As Lord Kelvin was quoted as saying, “If you cannot measure it, you
cannot improve it.”
PART 2
Traceable and efficient calibrations in the process industry
1. Introduction
Today's modern process plants, production processes and quality systems put new and tight requirements on the accuracy of process instruments and on process control. Quality systems, such as the ISO 9000 and ISO 14000 series of quality standards, call for systematic and well-documented calibrations, with regard to accuracy, repeatability, uncertainty, confidence levels, etc.
Does this mean that electricians and instrumentation people should be calibration experts? Not really, but the topic should not be ignored. Fortunately, modern calibration techniques and calibration systems have made it easier to fulfill the requirements on instrumentation calibration and maintenance in a productive way. However, some understanding of the techniques, terminology and methods involved in calibration is needed in order to perform according to international quality systems.
WHAT IS MEASUREMENT?
In technical standards terms, the word measurement has been defined as:
"A set of experimental operations for the purpose of determining the value of a quantity."

[Figure: Hierarchy of accuracy – from the true value through international and national standards, authorized laboratories, instrument departments, and house and working standards, down to process instrumentation.]
[Figure: Everything is based on measurements – instrumentation, measurements, controls, adjustments.]
they drift and lose their ability to give accurate measurements. This
drift makes recalibration necessary.
Environmental conditions, elapsed time and type of application can all affect the stability of an instrument. Even instruments of the same
manufacturer, type and range can show varying performance. One
unit can be found to have good stability, while another performs
differently.
[Figure: Quality maintenance over time. Purchased quality (QP) drifts toward the lower tolerance limit; calibrations C1–C7 at times T1, T2, T3 restore quality to "good as new" (maintained quality, QM), whereas without maintenance quality decays to QZM, the zero-maintained quality.]
5. Traceability
Calibrations must be traceable. Traceability is a declaration stating to
which national standard a certain instrument has been compared.
Calibration
An unknown measured signal is compared to a known reference signal.

Validation
Validation of measurement and test methods (procedures) is generally necessary to prove that the methods are suitable for the intended use.
Non-linearity
Non-linearity is the maximum deviation of a transducer’s output from
a defined straight line. Non-linearity is specified by the Terminal
Based method or the Best Fit Straight Line method.
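The two specification methods just named can be sketched in code; a minimal illustration, with five invented transducer points (input and output in % of span):

```python
# Non-linearity computed two ways: against the terminal-based line
# (through the two end points) and against the least-squares best-fit
# straight line. The data points are invented for illustration.

def terminal_based_nonlinearity(xs, ys):
    """Maximum deviation from the straight line through the end points."""
    slope = (ys[-1] - ys[0]) / (xs[-1] - xs[0])
    return max(abs(y - (ys[0] + slope * (x - xs[0])))
               for x, y in zip(xs, ys))

def best_fit_nonlinearity(xs, ys):
    """Maximum deviation from the least-squares best-fit straight line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return max(abs(y - (slope * x + intercept)) for x, y in zip(xs, ys))

xs = [0.0, 25.0, 50.0, 75.0, 100.0]       # input, % of span
ys = [0.02, 25.20, 50.30, 75.15, 100.01]  # invented output readings

print(f"terminal based: {terminal_based_nonlinearity(xs, ys):.3f}")
print(f"best fit:       {best_fit_nonlinearity(xs, ys):.3f}")
```

For the same data the best-fit method usually reports the smaller figure, which is why a specification should always state which method it uses.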
Resolution
Resolution is the smallest interval that can be read between two
readings.
Sensitivity
Sensitivity is the smallest variation in input that can be detected as an output. Good resolution is required in order to detect sensitivity.
Hysteresis
Hysteresis is the deviation in output at any given point within the instrument's sensing range when that point is first approached with increasing values and then with decreasing values.
Repeatability
Repeatability is the capability of an instrument to give the same output for repeated inputs of the same value over a period of time. Repeatability is often expressed as a standard deviation.
Temperature coefficient
The temperature coefficient is the change in a calibrator's accuracy caused by changes in ambient temperature (deviation from reference conditions). It is usually expressed as % F.S./°C or % of RDG/°C.
Stability
Often referred to as drift, stability is expressed as the percentage change in the calibrated output of an instrument over a specified period, usually 90 days to 12 months, under normal operating conditions. Drift is usually given as a typical value.

Accuracy
Generally, accuracy figures state the closeness of a measured value to a known reference value. The accuracy of the reference value is generally not included in the figures. It must also be checked whether errors such as non-linearity, hysteresis and temperature effects are included in the accuracy figures provided. Accuracy is usually expressed as % F.S. or % of RDG + adder.
The difference between these two expressions is great. The figure
below illustrates two typical pressure specifications:
0.05%FS (blue line)
0.025% of RDG + 0.01%FS (red line)
The only way to compare accuracy presented in different ways is to
calculate the total error at certain points.
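Such a point-by-point comparison of the two specifications above is easy to sketch; the 20 bar full-scale value is an assumption for illustration:

```python
# Total error of two pressure-calibrator accuracy specifications:
#   spec A: 0.05% of full scale (a constant error band)
#   spec B: 0.025% of reading + 0.01% of full scale (the "adder")
# The 20 bar full scale is an assumed value for illustration.

FULL_SCALE = 20.0  # bar

def error_fs(reading):
    """% F.S. specification: the allowed error is constant over the range."""
    return 0.05 / 100 * FULL_SCALE

def error_rdg(reading):
    """% RDG + adder specification: the error grows with the reading."""
    return 0.025 / 100 * reading + 0.01 / 100 * FULL_SCALE

for p in (2.0, 10.0, 20.0):
    print(f"at {p:5.1f} bar: %FS spec = {error_fs(p):.4f} bar, "
          f"%RDG spec = {error_rdg(p):.4f} bar")
```

With these figures the % RDG specification allows a much smaller error over most of the range, and the two expressions only approach each other near full scale.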
Uncertainty
Uncertainty is an estimate of the limits, at a given coverage factor (or confidence level), within which the true value lies.
Uncertainty is evaluated according to either a "Type A" or a "Type B" method. Type A involves the statistical analysis of a series of measurements.
Type B uncertainty
Type B evaluation of uncertainty involves the use of other means
to calculate uncertainty, rather than applying statistical analysis of a
series of measurements.
It involves the evaluation of uncertainty using scientific judgement
based on all available information concerning the possible variables.
Values belonging to this category may be derived from:
The proper use of the available information calls for insight based
on experience and general knowledge. It is a skill that can be learnt
with practice. A well-based Type B evaluation of uncertainty can
be as reliable as a Type A evaluation of uncertainty, especially in
a measurement situation where a Type A evaluation is based only
on a comparatively small number of statistically independent
measurements.
Expanded uncertainty
The EA has decided that calibration laboratories accredited by members of the EA shall state an expanded uncertainty of measurement, obtained by multiplying the uncertainty by a coverage factor k. In cases where a normal (Gaussian) distribution can be assumed, the standard coverage factor, k = 2, should be used. The expanded uncertainty then corresponds to a coverage probability (or confidence level) of approximately 95%.

For uncertainty specifications, there must be a clear statement of the coverage probability or confidence level. Usually one of the following confidence levels is used:
1 s = 68%
2 s = 95%
3 s = 99%
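As a sketch of how such a figure is produced: a Type A standard uncertainty of the mean of a short series of repeat readings, expanded with k = 2 (the readings below are invented):

```python
import statistics

def expanded_uncertainty(readings, k=2.0):
    """Type A standard uncertainty of the mean of repeated readings,
    multiplied by a coverage factor k (k = 2 corresponds to roughly 95%
    coverage for a normal distribution)."""
    n = len(readings)
    u = statistics.stdev(readings) / n ** 0.5  # std. deviation of the mean
    return k * u

# Five invented repeat readings against a nominal 100.00 reference:
readings = [100.02, 99.98, 100.01, 99.99, 100.00]
print(f"U (k=2) = {expanded_uncertainty(readings):.4f}")
```

A full uncertainty budget would also combine Type B contributions before applying the coverage factor; this sketch shows the Type A part only.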
CONCLUSION
A good, automated calibration system reduces workload.

References
[1] ISO 9001:2000, "Quality Management Systems. Requirements"
[2] 21 CFR Part 11, "Electronic Records; Electronic Signatures"
[3] 21 CFR Part 211, "Current Good Manufacturing Practice for Finished Pharmaceuticals"
63
How often should instruments be calibrated
Plants can improve their efficiency and reduce costs by performing
calibration history trend analysis. By doing so, a plant can determine
which instruments can be calibrated less frequently and which should be
calibrated more frequently. Calibration history trend analysis is only
possible with calibration software that provides this functionality.
Using Calibration History Trend Analysis to Adjust Calibration
Intervals of Plant Instrumentation
Manufacturing plants need to be absolutely confident that their
instrumentation products – temperature sensors, pressure transducers,
flow meters and the like – are performing and measuring to specified
tolerances. If sensors drift out of their specification range, the
consequences can be disastrous for a plant, resulting in costly production
downtime, safety issues or possibly leading to batches of inferior quality
goods being produced, which then have to be scrapped.
Most process manufacturing plants will have some sort of
maintenance plan or schedule in place, which ensures that all
instruments used across the site are calibrated at the appropriate times.
However, with increasing demands and cost issues being placed on
manufacturers these days, the time and resources required to carry
out these calibration checks are often scarce. This can sometimes lead
to instruments being prioritised for calibration, with those deemed
critical receiving the required regular checks, while sensors deemed
less critical to production are calibrated less frequently or not at all.
But plants can improve their efficiencies and reduce costs by
using calibration ‘history trend analysis’, a function available within
Beamex® CMX calibration software. With this function, the plant
can analyze whether it should increase or decrease the calibration
frequency for each of its instruments.
Cost savings can be achieved in several ways. First, by calibrating
less frequently where instruments appear to be highly stable according
to their calibration history. Second, by calibrating instruments more
often when they are located in critical areas of the plant, ensuring
that instruments are checked and corrected before they drift out
of tolerance. This type of practice is common in companies that
employ an effective ‘Preventive Maintenance’ regime. The analysis
of historical trends, showing how a pressure sensor, for example, drifts
in and out of tolerance over a given time period, is only possible with
calibration software that provides this type of functionality.
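The interval adjustment described above can be sketched in code. This is a minimal illustration only, not Beamex CMX logic; the function name, the 20% safety margin and the halving/doubling bounds are assumptions chosen for the example.

```python
def suggest_interval(history, tolerance, current_interval):
    """Sketch of calibration-interval adjustment from as-found history.
    history: (days_since_first_calibration, as_found_error) pairs.
    Fits a least-squares line to the absolute errors; a stable device gets
    a longer interval, a drifting one gets its next calibration scheduled
    well before the projected tolerance crossing."""
    n = len(history)
    xs = [t for t, _ in history]
    ys = [abs(e) for _, e in history]
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    denom = sum((x - x_mean) ** 2 for x in xs)
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / denom
    if slope <= 0:                    # no drift observed: calibrate less often
        return current_interval * 2
    days_to_limit = (tolerance - ys[-1]) / slope   # projected days until out of tolerance
    proposed = 0.8 * days_to_limit                 # keep a 20% safety margin
    return max(min(proposed, 2 * current_interval), current_interval / 2)

# Hypothetical transmitter: error grows ~0.1 units per 180 days,
# tolerance 0.5 units, current interval 180 days
interval = suggest_interval([(0, 0.1), (180, 0.2), (360, 0.3)], 0.5, 180)
```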
SUMMARY
How often should calibrators be calibrated
As a general rule for Beamex’s MC5, starting with a 1-year
calibration period is recommended, because the MC5 has a 1-year
uncertainty specification. The calibration period can be changed in
the future, once you begin accumulating stability history, which can
then be compared to the uncertainty requirements. In any case, there are
many issues to be considered when deciding a calibrator’s calibration
period, or the calibration period for any type of measuring device. This
article discusses some of the things to be considered when determining
the calibration period, and provides some general guidelines for making
this decision. The guidelines that apply to a calibrator also apply to
other measuring equipment in the traceability chain. These guidelines
can also be used for process instrumentation.
An important aspect to consider when maintaining a traceable
calibration system is to determine how often the calibration equipment
should be recalibrated. International standards (such as ISO9000,
ISO10012, ISO17025, CFRs by FDA, GMP, etc.) require the use
of documented calibration programs. This means that measuring
equipment should be calibrated traceably at appropriate intervals and
that the basis for the calibration intervals should be evaluated and
documented.
When determining an appropriate calibration period for any
measuring equipment, there are several things to be considered. They
are discussed below.
Uncertainty need
One of the first things to evaluate is the uncertainty need of the
customer for their particular measurement device. In fact, the initial
selection of the measurement device should also be based on this
evaluation. Uncertainty need is one of the most important things to
consider when determining the calibration period.
Stability history
When the customer has evaluated his/her needs and purchased suitable
measuring equipment, (s)he should monitor the stability history of that
equipment. The stability history is an important criterion when
deciding upon any changes in the calibration period. Comparing the
stability history of measuring equipment to the specified limits and
uncertainty needs provides a feasible tool for evaluating the calibration
period. Naturally, calibration management software with a history
analysis option is a great help in making this type of analysis.
The cost of recalibration vs. consequences
of an out-of-tolerance situation
Optimizing between recalibration costs and the consequences of an out-
of-tolerance situation is important. In critical applications, the costs of
an out-of-tolerance situation can be extremely high (e.g. pharmaceutical
applications) and therefore calibrating the equipment more often is
safer. However, in some non-critical applications, where the out-of-
tolerance consequences are not serious, calibration can be performed less
frequently. Therefore, evaluating the consequences of an out-of-
tolerance situation is something to be considered. The corrective actions
in such a case should also be made into an operating procedure.
Some measurements in a factory typically have more effect on
product quality than others; these more critical measurements should
also be calibrated more often than others.
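The trade-off described above, between recalibration costs and the consequences of an out-of-tolerance situation, can be illustrated with a toy cost model. All figures, and the assumption that out-of-tolerance risk scales linearly with the interval, are hypothetical and serve only to show the shape of the optimization.

```python
def annual_cost(interval_days, cal_cost, oot_cost, p_oot_per_year):
    """Hypothetical model: yearly calibration cost plus expected
    out-of-tolerance (OOT) cost. p_oot_per_year is the chance the device
    drifts out of tolerance within a year if never recalibrated; calibrating
    every interval_days scales that risk roughly by interval_days / 365."""
    cals_per_year = 365 / interval_days
    risk = p_oot_per_year * (interval_days / 365)
    return cals_per_year * cal_cost + risk * oot_cost

# Pick the candidate interval with the lowest expected annual cost
# (all monetary figures are made up for the illustration)
candidates = [30, 90, 180, 365]
best = min(candidates, key=lambda d: annual_cost(
    d, cal_cost=200, oot_cost=50000, p_oot_per_year=0.1))
```

For a critical (expensive-failure) instrument the optimum shifts toward short intervals; for a cheap, non-critical one it shifts toward long intervals, which mirrors the reasoning in the text.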
SUMMARY
Automated calibration
planning lowers costs
Calibration is an essential element of any instrumentation
maintenance program. However, calibration operations can sometimes
be long and time-consuming. By planning the process and adding the
right tools, efficiency can be improved and costs lowered substantially.
Accumulated wear and random variations in a sensor’s environment
will inevitably reduce its accuracy over time, so periodic testing is
required to guarantee that the measurements being reported actually
match the conditions being monitored. Otherwise, any computerized
monitoring or control systems, to which the sensor is interfaced, will
be unable to detect off-spec conditions and the quality of the product
being manufactured will suffer.
Unfortunately, calibration operations can be long and tedious,
even with the aid of an electronic calibrator that automates the tests.
The sheer volume of data that must be collected and analyzed can be
overwhelming when there are hundreds of sensors to be checked and
multiple data points to be recorded for each.
For instance...
The experience of Croda Chemicals Europe (Nr. Goole, East Yorkshire,
UK) is typical. They use pressurized vessels to purify lanolin for health
care and beauty products. Each vessel needs to be certified at least
once every two years to demonstrate that it is safe and structurally
sound. That includes a functionality check on all of the pressure
instrumentation, as well as on the sensors that monitor the incoming
chemical additives and the outgoing effluent.
Calibration planning
Calibration software like CMX can also help with the planning of
calibration operations. Calibration schedules take into account the
accuracy required for a particular sensor and the length of time during
which it has previously been able to maintain that degree of accuracy.
Sensors that are found to be highly stable need not be re-calibrated as
often as sensors that tend to drift.
The trick is determining which sensors should be re-calibrated after
a few hours, weeks, or years of operation and which can be left as-is for
Benefits of integrating calibration software with CMMS
For process manufacturers today, having a reliable, seamlessly
integrated set of IT systems across the plant, or across multiple sites,
is critical to business efficiency, profitability and growth.
Maintaining plant assets – whether that includes production line
equipment, boilers, furnaces, special purpose machines, conveyor
systems or hydraulic pumps – is equally critical for these companies.
This is particularly true if the company is part of an asset-intensive
industry, where equipment and plant infrastructure is large, complex
and expensive. Also, if stoppages to production lines due to equipment
breakdowns are costly, implementing the latest computerised
maintenance management (CMM) systems can help save precious
time and money.
In the process industries, a small but critical part of a company’s
asset management strategy should be the calibration of process
instrumentation. For this, Beamex’s calibration management software,
Beamex® CMX, has proved itself time and time again across many
industry sectors, including pharmaceuticals, chemicals, nuclear, metal
processing, paper, oil and gas. Manufacturing plants need to be sure
that their instrumentation products – temperature sensors, pressure
transducers, flow meters and the like – are performing and measuring
to specified tolerances. If sensors drift out of their specification
[Figure: integration architecture – the plant CMMS and Beamex® CMX calibration software linked through a connector.]
SUMMARY
Fieldbus transmitters
must also be calibrated
Fieldbus is becoming more and more common in today’s
instrumentation. But what is fieldbus and how does it differ from
conventional instrumentation? Fieldbus transmitters must be
calibrated as well, but how can it be done? Until now, no practical
solutions have existed for calibrating fieldbus transmitters, but now
Beamex has introduced the world’s first portable fieldbus calibrator –
the MC5 Fieldbus Calibrator.
Conventional transmitters can deliver only one parameter at a time,
in one direction. Each transmitter needs a dedicated pair of cables,
and I/O subsystems are required to convert the analog mA signal to a
digital format for a control system.
Fieldbus transmitters are able to deliver a huge amount of
information via the quick two-way bus. Several transmitters
can be connected to the same pair of wires. Conventional I/O
systems are no longer needed because segment controllers connect the
instrument segments to the quicker, higher-level fieldbus backbone.
History of fieldbus
Back in the 1940s, instrumentation utilized mainly pneumatic signals
to transfer information from transmitters. During the 1960s, the mA
signal was introduced, making things much easier. In the 1970s,
computerized control systems began to make their arrival. The first
digital, smart transmitter was introduced in the 1980s, using first
proprietary protocols. The first fieldbus was introduced in 1988, and
throughout the 1990s a number of various fieldbuses were developed.
During the 1990s, manufacturers battled to see whose fieldbus
would be the one most commonly used. A standard was finally set
in the year 2000 when the IEC 61158 standard was approved. The
Foundation Fieldbus H1 and the Profibus PA, both used in process
instrumentation, were chosen as standards.
For the most part, one can say that the Foundation Fieldbus is
dominating the North American markets and the Profibus is the
market leader in Europe. Other regions are more divided. There are also
certain applications that prefer a particular fieldbus regardless of
the geographical location.
Future of fieldbus
Currently, a large number of fieldbus installations already exist and
the number is increasing at a huge rate. A large portion of new projects
is currently being carried out using fieldbus. Critical applications and
hazardous areas have also begun to adopt fieldbus.
The Foundation Fieldbus and Profibus have begun to clearly
dominate the fieldbus markets. Both Foundation Fieldbus and Profibus
have reached such a large market share that both buses will most likely
remain in use in the future. The development of new fieldbuses has
slowed down, and it is unlikely that new fieldbus standards will appear
in the near future to challenge the position of Foundation Fieldbus
or Profibus.
Recent co-operation between Foundation Fieldbus and Profibus
suppliers will further strengthen the position of these two standards.
Calibration of weighing instruments, Part 1
From the point of view of the owner, weighing instruments, usually
called scales or balances, should provide correct weighing results.
How the weighing instrument is used and how reliable the weighing
results are can vary greatly. Weighing instruments used for legal
purposes must undergo legal verification.
If a weighing instrument is used in a quality system, the user must
define its measurement capability. In any case, it is the owner
or the user of the instrument who carries the final responsibility for
measurement capability and who is also responsible for the processes
involved. (S)He must select the weighing instrument and maintenance
procedure to be used to reach the required measurement capability.
From a regulatory point of view, the quality of a weighing instrument
is already defined in OIML regulations, at least in Europe. Calibration
is a means for the user to obtain evidence of the quality of weighing
results, and the user must have the knowledge to apply the information
achieved through calibration.
material. The features may vary slightly from country to country, but
in the EU they are the same, at least at the stage when the weighing
instrument is being introduced into use.
Verification and calibration follow different philosophies.
Calibration determines the deviation between the indication and the
reference (standard), including its tolerance, whereas verification
checks that the errors of indication remain within maximum
permissible limits. This is a
feasible practice for all weighing. The practical work for both methods
is very similar and both methods can be used to confirm measurement
capability, as long as legal verification is not needed. The terminology
and practices used previously for verifying measurement capability,
and for weighing technology in general, are based on these practices of
calibrating and verifying, even if it was a question of general
(non-legal) weighing.

Confirmation is the collecting of information
Confirming the capability of weighing instruments should happen
by estimating the quality of the measuring device in the place where
it will be used. In practice, this means investigating the performance of
the weighing instrument; this operation is known as calibration (or
verification). One calibration provides information on a temporary
basis and a series of calibrations provides time-dependent information.
The method of calibration should be selected such that it provides
sufficient information for evaluating the required measuring tolerance.
The method should be precisely defined, so that comparable results are
achieved during all calibrations.
Comparing the indication of weighing instruments with a set
standard gives the deviation or error. However, to be able to define the
measuring tolerance, we need more information about the weighing
instrument, such as repeatability, eccentric load, hysteresis, etc. We
must remember that the quality of the evaluation of measuring
tolerance depends on the collected information through calibration.
Using a calibration program, which goes through the same steps for
every calibration – calculates deviation and measuring tolerance, and, if
necessary, produces a calibration certificate – is the best way to achieve
reliable information to use in comparisons. This type of program
is able to store all the history of calibrated weighing instruments,
including information for other measuring devices. It is also handy
for monitoring measuring systems. The most important aspect of a
SUMMARY
Aimo Pusa, Lahti Precision, Finland
Calibration of weighing instruments, Part 2
Weighing is a common form of measurement in commerce, industry
and households. Weighing instruments are often highly accurate, but
users, i.e. their customers and/or regulatory bodies, often need to
know just how inaccurate a particular scale may be. Originally, this
information was obtained by classifying and verifying the equipment
for type approval. Subsequently, the equipment was tested or
calibrated on a regular basis.

Typical calibration procedures
Calibrating scales involves several different procedures depending
on national- and/or industry-specific guidelines or regulations, or on
the potential consequences of erroneous weighing results. One clear
and thorough guide is the EA-10/18, Guidelines on the Calibration
of Non-automatic Weighing Instruments, which was prepared by
the European Co-operation for Accreditation, and published by the
European Collaboration in Measurement and Standards (EUROMET).
Typical scale calibration involves weighing various standard weights
in three separate tests:
• repeatability test
• eccentricity test
• weighing test (test for errors of indication)
In the pharmaceutical industry in the United States, tests for
determining minimum weighing capability are also performed.
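The three tests listed above can be sketched in code. This is an illustrative outline only; the function names and the exact statistics a laboratory reports (per EA-10/18) are assumptions made for the example.

```python
import statistics

def repeatability(readings):
    """Repeatability test: standard deviation of repeated weighings
    of the same load placed at the centre of the pan."""
    return statistics.stdev(readings)

def eccentricity(center_reading, corner_readings):
    """Eccentricity test: largest deviation of the off-centre indications
    from the centre indication, for the same load."""
    return max(abs(r - center_reading) for r in corner_readings)

def errors_of_indication(points):
    """Weighing test: error of indication for several predefined loads,
    given (reference_mass, indication) pairs."""
    return [indication - ref for ref, indication in points]

# Hypothetical readings in kilograms
rep = repeatability([10.00, 10.01, 9.99, 10.00])
ecc = eccentricity(10.00, [10.02, 9.99, 10.01, 10.00])
errs = errors_of_indication([(10, 10.02), (20, 19.99)])
```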
The Weighing Test examines the error of the indication of the scale
for several predefined loads. This enables you to correct the errors
and determine the non-linearity and hysteresis.
If the scale’s maximum load limit is extremely large, it may be
impractical to use standard weights for calibrating the entire range.
In such a case, suitable substitution mass is used instead. Substitution
mass should also be used if the construction of the scale does not allow
the use of standard weights.
be about the error found at each point of calibration. There are several
sources of uncertainty of the error, e.g.:

Example: The calibration error and its uncertainty at the calibration
point of 10 kg may be expressed, for example, as E = 2.5 g and
u(E) = ±0.7 g. This means that the calculated error in the indication
is 2.5 g and the actual error, with a coverage probability of 68.27%,
is between 1.8 g and 3.2 g. With the expanded uncertainty
U(E) = 2u(E) (coverage probability 95.45%) the error is between
1.1 g and 3.9 g, and with U(E) = 3u(E) (coverage probability 99.73%)
it is between 0.4 g and 4.6 g.
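The coverage intervals in the example follow directly from E ± k·u(E). A small sketch reproducing the quoted figures:

```python
def coverage_interval(error, u, k):
    """Interval E +/- k*u(E) for coverage factor k."""
    return (error - k * u, error + k * u)

# Figures from the example: E = 2.5 g, u(E) = 0.7 g
E, u = 2.5, 0.7
k1 = coverage_interval(E, u, 1)   # ~68.27% coverage: (1.8, 3.2)
k2 = coverage_interval(E, u, 2)   # ~95.45% coverage: (1.1, 3.9)
k3 = coverage_interval(E, u, 3)   # ~99.73% coverage: (0.4, 4.6)
```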
Calibration in
hazardous environments
Striking a match in an environment that contains combustible gas is
nothing short of dangerous – personal injury and property damage
are likely consequences. Improperly calibrating an instrument in
such a hazardous environment can be almost as dangerous.
The materials and fluids used in some processes can be hazardous in
the sense that they can ignite or explode. For example, hydrocarbons
in mines, oil refineries, and chemical plants are flammable and are
typically contained within vessels and pipes. If they were always fully
contained, an external flame could not ignite them. However, in
many locations, leaks, abnormal conditions, and fluid accumulation
may allow hydrocarbons to be present such that a flame could ignite
them, with disastrous results.
Hydrocarbons and other flammable fluids are not limited to the
petroleum and chemical industries. For example, combustible fuels,
such as natural gas, are used in all industries, including agriculture,
food, pharmaceuticals, power generation, pulp/paper, water/
wastewater, universities, retail, and in the home.
In addition, many materials and fluids used in seemingly “safe”
industries are themselves flammable. Even seemingly safe water
treatment systems use combustible materials such as chlorine in their
processes. This means that certain areas of a water treatment plant
may well be considered hazardous. Similarly, certain areas of food
plants, such as reactors that hydrogenate oils, may pose hazards as
well. Therefore, it is important for plants to examine their processes
Improving power plant performance through calibration
flow to a heat exchanger indicates that either the steam trap on the heat
exchanger leaks (wasting steam), or that the process changed (wasting
steam) and needs to be investigated to reduce steam consumption.
Field calibrators help ensure that these instruments operate properly
and accurately quantify energy savings. In addition, instruments that
are regularly and accurately calibrated often tend to improve plant
safety and reduce the probability of equipment damage.
Further, automated portable calibrators can improve the calibration
process by automating the generation of the transmitter inputs and the
recording of the transmitter measurements. This makes the calibration
process much less time-consuming and improves the accuracy with
which data is collected by reducing the probability of human error.
Expensive qualification requirements often preclude the opening
of transmitters in nuclear power plants. In these applications, the
calibration process can be performed faster and more accurately
with an automated calibrator as compared to using existing manual
calibration techniques.
The Beamex® MC5 multifunction calibrators and the Beamex®
CMX calibration software form an integrated, automated calibration
system. Calibrations performed using automated field calibrators
Calibration in the
pharmaceutical industry
Pharmaceutical Operations
The pharmaceutical industry produces products that directly
affect the lives of the majority of the billions of people who inhabit
the Earth. As such, a seemingly small mistake or failure could
adversely affect the health of thousands of people. Regulators in the
pharmaceutical industry recognize these stakes and have implemented
various regulations to ensure the integrity of pharmaceutical processes
and, hence, the safety and efficacy of the pharmaceutical products on
which billions of people rely.
Individuals who are not directly associated with the pharmaceutical
industry should take note because some aspects of these regulations
are being adopted in the process industries. Therefore, it would be
pragmatic to use equipment and adopt practices that either meet or
can easily be upgraded to meet pharmaceutical requirements.
Some high-volume pharmaceuticals are often manufactured
using continuous processing techniques; however, pharmaceutical
manufacturing is typically performed in batches. As such, these
processes typically incorporate many pressure and temperature
measurements such as local indicators (gauges), transmitters and
switches. Many of these measurements are at extreme conditions such
as may be found in an autoclave. While there may be some flowmeters,
batch processes typically incorporate weighing instruments to
implement material additions. Some processes involve clean rooms
where the measurement of low differential pressures is important.
Process measurements can be critical to ensure product quality.
[Flowchart: calibration results are recorded via MC5, PDA or manual entry; if the calibration is not OK, the instrument is adjusted and As Left results are recorded, again via MC5, PDA or manual entry.]
SUMMARY
• Proper calibration and its documentation are important for
maintaining the safety and efficacy of pharmaceuticals.
• Portable Beamex calibrators for hazardous locations are
designed for use in virtually all pharmaceutical plants.
• Beamex software follows the guidelines of 21 CFR Parts 11
and 211 and makes calibration and its documentation easy.
The benefits of using a documenting calibrator
Why calibrate?
For process manufacturers, regular calibration of instruments across
a manufacturing plant is common practice. In plant areas where
instrument accuracy is critical to product quality or safety, calibration
every six months – or even more frequently – is not unusual.
However, the key final step in any calibration process –
documentation – is often neglected or overlooked because of a lack
of resources, time constraints or the pressure of everyday activities.
Indeed, many manufacturers are now outsourcing all or some of their
maintenance activities, and so the contractor too is now under the
same pressure to calibrate plant instruments quickly but accurately and
to ensure that the results are then documented for quality assurance
purposes and to provide full traceability.
The purpose of calibration itself is to determine how accurate an
instrument or sensor is. Although most instruments are very accurate
these days, regulatory bodies often need to know just how inaccurate
a particular instrument is and whether it drifts in and out of specified
tolerance over time.
By using a documenting calibrator, the calibration results are stored
automatically in the calibrator’s memory during the calibration
process. The engineer does not have to write any results down on
paper, so the whole process is much faster and costs are therefore
reduced. The quality and accuracy of calibration results will also
improve, as there will be fewer mistakes due to human error.
The calibration results are transferred automatically from the
calibrator’s memory to the computer/database. This means the
engineer does not spend time transferring the results from his notepad
to final storage on a computer, again, saving time and money.
With instrument calibration, the calibration procedure itself is
critical. Performing the calibration procedure in the same way each
time is important for consistency of results. With a documenting
calibrator, the calibration procedure can be automatically transferred
from the computer to the handheld calibrator before going out into
the field.
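The "instant pass or fail" decision described next reduces to comparing the error at each test point against the tolerance downloaded with the procedure. A minimal sketch; the 0–100 kPa transmitter, its readings and the 0.5%-of-span tolerance are hypothetical:

```python
def check_point(nominal, measured, tolerance_pct, span):
    """Pass/fail for one test point: the error, as a percentage of span,
    must stay within the tolerance downloaded with the procedure."""
    error_pct = abs(measured - nominal) / span * 100
    return "PASS" if error_pct <= tolerance_pct else "FAIL"

# Hypothetical 0-100 kPa transmitter checked at five points,
# tolerance 0.5% of span
results = [check_point(n, m, 0.5, 100) for n, m in
           [(0, 0.1), (25, 25.2), (50, 50.7), (75, 75.3), (100, 99.9)]]
# the 50 kPa point fails its tolerance; all others pass
```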
As Heikki Laurila states: “Engineers who are out in the field
performing instrument calibrations, get instant pass or fail messages
with a documenting calibrator. The tolerances and limits for a sensor,
as well as detailed instructions on how to calibrate the transmitter,
are input once in the calibration management software and then
downloaded to the calibrator. This means calibrations are carried
out in the same way every time as the engineer is being told by the
calibrator which test point he needs to measure next. Also, having
SUMMARY
The safest way to calibrate
Software for calibration management
Every plant has some sort of system in place for managing
calibration operations and data, but the methods for doing so vary
greatly in terms of cost, quality, efficiency and accuracy of data.

Introduction
Every manufacturing plant has some sort of system in place for
managing instrument calibration operations and data. Plant
instrumentation devices – such as temperature sensors, pressure
transducers and weighing instruments – require regular calibration to
ensure they are performing and measuring to specified tolerances.
However, different companies from a diverse range of industry
sectors use very different methods of managing these calibrations.
These methods differ greatly in terms of cost, quality, efficiency, and
accuracy of data and their level of automation.
Calibration software is one such tool that can be used to support and
guide calibration management activities, with documentation being
a critical part of this.
But in order to understand how software can help process plants
better manage their instrument calibrations, it is important to consider
the typical calibration management tasks that companies have to
undertake. There are five main areas here: planning and decision-
making; organisation; execution; documentation; and analysis.
Careful planning and decision-making are important. All plant
Documentation
Documentation is a very important part of a calibration management
process. ISO 9001:2000 and the FDA both state that calibration
Paper-based systems
These systems typically involve hand-written documents: engineers
use pen and paper to record calibration results while out in the
field. On returning to the office, these notes are tidied up or
transferred to another paper document, after which they are
archived in paper form.
While a manual, paper-based system requires little or no
investment, it is very labour-intensive and makes historical trend
analysis very difficult to carry out. In addition, the calibration
data is not easily accessible. The system is time consuming, soaks
up a lot of resources, and typing errors are commonplace. Dual
effort and re-keying of calibration data are also significant costs
here.

Regardless of industry sector, there seem to be some general
challenges that companies face when it comes to calibration
management.
In-house legacy systems (spreadsheets, databases, etc.)
Although certainly a step in the right direction, using an in-house
legacy system to manage calibrations has its drawbacks. In these
systems, calibration data is typically entered manually into a
spreadsheet or database. The data is stored in electronic format, but
the recording of calibration information is still time-consuming and
typing errors are common. Also, the calibration process itself cannot
be automated. For example, automatic alarms cannot be set up on
instruments that are due for calibration.
Calibration Software
With specialist calibration management software, users are provided
with an easy-to-use Windows Explorer-like interface. The software
manages and stores all instrument and calibration data. This
includes the planning and scheduling of calibration work; analysis
and optimisation of calibration frequency; production of reports,
certificates and labels; communication with smart calibrators; and
easy integration with CMM systems such as SAP and Maximo. The
result is a streamlined, automated calibration process, which
improves quality, plant productivity and efficiency.

Using software for calibration management enables faster, easier
and more accurate analysis of calibration records and
identification of historical trends.

Benefits of Using Calibration Software
With software-based calibration management, planning and decision-
making are improved. Procedures and calibration strategies can be
planned and all calibration assets managed by the software.
Position, device and calibrator databases are maintained, while
automatic alerts for scheduled calibrations can be set up.
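The scheduling logic behind such automatic alerts can be sketched as a simple due-date check. This is a minimal illustration, not any vendor's implementation; the tag names, dates and one-year interval are hypothetical:

```python
from datetime import date, timedelta

# Hypothetical position records: instrument tag -> date of last calibration.
instruments = {
    "TT-101": date(2024, 1, 15),
    "PT-204": date(2023, 6, 1),
}

INTERVAL = timedelta(days=365)  # assumed one-year calibration interval


def due_for_calibration(last_calibrated: date, today: date) -> bool:
    """True when the calibration interval has elapsed for an instrument."""
    return today - last_calibrated >= INTERVAL


def due_list(today: date) -> list[str]:
    """Tags that should raise an automatic calibration alert today."""
    return [tag for tag, last in instruments.items()
            if due_for_calibration(last, today)]
```

A check run on 1 July 2024 would flag only PT-204, since TT-101 was calibrated less than a year before.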
Organisation also improves. The system no longer requires pens and
paper. Calibration instructions are created using the software to guide
engineers through the calibration process. These instructions can also
be downloaded to a technician’s handheld documenting calibrator
while they are in the field.
Execution is more efficient and errors are eliminated. Using
software-based calibration management systems in conjunction with
documenting calibrators means that calibration results can be stored
in the calibrator’s memory, then automatically uploaded back to the
calibration software. There is no re-keying of calibration results from
a notebook to a database or spreadsheet. Human error is minimised
and engineers are freed up to perform more strategic analysis or other
important activities.
Documentation is also improved. The software generates reports
automatically and all calibration data is stored in one database rather
than multiple disparate systems. Calibration certificates, reports and
Summary
Every type of process plant, regardless of industry sector, can benefit
from implementing specialist calibration management software.
Compared to traditional, paper-based systems, in-house built legacy
calibration systems or the calibration modules of CMM systems, using
dedicated calibration management software results in improved
quality, increased productivity and reduced costs of the entire
calibration process.
Despite these benefits, only one quarter of companies who need
to manage instrument calibrations actually use software designed for
that purpose.
Calibration Terminology A to Z
Introduction
This glossary is a quick reference to the meaning of common
terms. It is a supplement to the VIM, GUM, NCSL Glossary,
and the information in the other references listed at the end.
In technical, scientific and engineering work (such as metrology)
it is important to correctly use words that have a technical meaning.
Definitions of these words are in relevant national, international
and industry standards; journals; and other publications, as well as
publications of relevant technical and professional organizations.
Those documents give the intended meaning of the word, so everyone
in the business knows what it is. In technical work, only the technical
definitions should be used.
Many of these definitions are adapted from the references. In some
cases several may be merged to better clarify the meaning or adapt
the wording to common metrology usage. The technical definitions
may be different from the definitions published in common grammar
dictionaries. However, the purpose of common dictionaries is to record
the ways that people actually use words, not to standardize the way
the words should be used. If a word is defined in a technical standard, its
definition from a common grammar dictionary should never be used in work
where the technical standard can apply.
Terms that are not in this glossary may be found in one of these
primary references:
1. ISO. 1993. International vocabulary of basic and general terms in
metrology (called the VIM); BIPM, IEC, IFCC, ISO, IUPAC,
IUPAP, and OIML. Geneva: ISO.
2. ANSI/NCSL. 1997. ANSI/NCSL Z540-2-1997, U. S. Guide to the
expression of uncertainty in measurement (called the GUM). Boulder,
CO: NCSL International.
3. NCSL. 1999. NCSL Glossary of metrology-related terms. 2nd ed.
Boulder, CO: NCSL International.
Glossary
Accreditation (of a laboratory) – Formal recognition by an accreditation
body that a calibration or testing laboratory is able to competently
perform the calibrations or tests listed in the accreditation
scope document. Accreditation includes evaluation of both the
quality management system and the competence to perform the
measurements listed in the scope.
Accreditation body – An organization that conducts laboratory
accreditation evaluations in conformance to ISO Guide 58.
Accreditation certificate – Document issued by an accreditation
body to a laboratory that has met the conditions and criteria for
accreditation. The certificate, with the documented measurement
parameters and their best uncertainties, serves as proof of accredited
status for the time period listed. An accreditation certificate
without the documented parameters is incomplete.
Accreditation criteria – Set of requirements used by an accrediting
body that a laboratory must meet in order to be accredited.
Accuracy (of a measurement) – Accuracy is a qualitative indication of
how closely the result of a measurement agrees with the true value
of the parameter being measured. (VIM, 3.5) Because the true
value is always unknown, accuracy of a measurement is always
an estimate. An accuracy statement by itself has no meaning
Notes:
• A requirement for calibration does not imply that the item
being calibrated can or should be adjusted.
• The calibration process may include, if necessary, calculation
of correction factors or adjustment of the instrument being
compared to reduce the magnitude of the inaccuracy.
• In some cases, minor repair such as replacement of batteries,
fuses, or lamps, or minor adjustment such as zero and span,
may be included as part of the calibration.
• Calibration does not include any maintenance or repair actions
except as just noted. See also: performance test, calibration
procedure. Contrast with: calibration (2) and repair.
CI = x̄ ± t · s / √n    or    CI = p ± t · √( p(1 − p) / n )
where
CI is the confidence interval,
n is the number of items in the sample,
p is the proportion of items of a given type in the population,
s is the sample standard deviation,
x̄ is the sample mean, and
t is the Student’s T value for α ⁄2 and (n – 1) degrees of freedom
(α is the level of significance).
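As an illustration of the first form, a confidence interval for the mean of a small set of repeated readings can be computed directly. The readings and the t value below are made up for the example:

```python
import math
from statistics import mean, stdev

def mean_confidence_interval(samples, t_value):
    """CI = x-bar +/- t * s / sqrt(n) for a small sample of readings."""
    n = len(samples)
    x_bar = mean(samples)
    s = stdev(samples)  # sample standard deviation
    half_width = t_value * s / math.sqrt(n)
    return (x_bar - half_width, x_bar + half_width)

# Five hypothetical repeated readings of the same quantity.
readings = [100.02, 99.98, 100.01, 99.99, 100.00]
# Student's t for alpha/2 = 0.025 and n - 1 = 4 degrees of freedom.
low, high = mean_confidence_interval(readings, t_value=2.776)
```

The interval is centred on the sample mean, and its half-width shrinks as √n grows, which is why averaging repeated readings tightens the estimate.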
Standard reference material – A standard reference material, as
defined by NIST, “is a material or artifact that has had one or more
of its property values certified by a technically valid procedure,
and is accompanied by, or traceable to, a certificate or other
documentation which is issued by NIST… Standard reference
materials are…manufactured according to strict specifications
and certified by NIST for one or more quantities of interest.
SRMs represent one of the primary vehicles for disseminating
measurement technology to industry.”
Standard uncertainty – The uncertainty of the result of a measurement,
expressed as a standard deviation. (GUM, 2.3.1)
Standardization – See: self-calibration.
Systematic error – A systematic error is the mean of a large number of
measurements of the same value minus the (probable) true value
of the measured parameter. (VIM, 3.14) Systematic error causes the
average of the readings to be offset from the true value. Systematic
error is a measure of magnitude and may be corrected. Systematic
error is also called bias when it applies to a measuring instrument.
Systematic error may be evaluated by Type A or Type B methods,
according to the type of data available. Note: Contrary to popular
belief, the GUM specifically does not replace systematic error with
either Type A or Type B methods of evaluation. (3.2.3, note) See
also: bias, error, correction (of error). Compare with: random error.
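The definition can be illustrated numerically; the readings and the (probable) true value below are invented for the example:

```python
from statistics import mean

# Hypothetical repeated readings of a reference whose (probable)
# true value is 10.00.
readings = [10.12, 10.11, 10.13, 10.12]
true_value = 10.00

# Systematic error (bias) = mean of the readings - true value.
systematic_error = mean(readings) - true_value

# Because systematic error is an offset of known magnitude, it may be
# corrected: the correction is the negative of the bias.
correction = -systematic_error
```

Here the instrument reads about 0.12 units high on average, so a correction of about −0.12 units would be applied to its readings.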
Test accuracy ratio – (1) In a calibration procedure, the test accuracy
ratio (TAR) is the ratio of the accuracy tolerance of the unit under
calibration to the accuracy tolerance of the calibration standard
used. (NCSL, page 2)
TAR = UUT tolerance / STD tolerance
TUR = UUT tolerance / STD uncertainty
PORTABLE CALIBRATORS
CALIBRATION SOFTWARE
About Beamex
• One of the world’s leading providers of calibration solutions.
• Develops and manufactures high-quality calibration equipment,
software, systems and services for the calibration and maintenance
of process instruments.
• Certified in accordance with the ISO 9001:2000 quality standard.
• Comprehensive product range includes portable calibrators,
workstations, calibration software, accessories, professional
services and industry-specific solutions.
• Products and services available in more than 60 countries. More
than 10,000 companies worldwide utilize Beamex’s calibration
solutions.
• Customers from a wide range of industries, such as automotive,
aviation, contractor engineering, education, food and beverage,
manufacturing, marine, metal and mining, nuclear, oil and gas,
petrochemical and chemical, pharmaceutical, power and energy, and
pulp and paper.
• For customers with requirements for accuracy, versatility,
efficiency, ease-of-use and reliability.
• Beamex’s Accredited Calibration Laboratory is accredited and
approved by FINAS (Finnish Accreditation Service). FINAS is a
member of all Multilateral Recognition Agreements / Mutual
Recognition Arrangements (MLA/MRA) signed by European and other
international organizations, i.e. European co-operation for
Accreditation (EA), International Laboratory Accreditation
Cooperation (ILAC) and International Accreditation Forum Inc. (IAF).

Why is Beamex better?

Accuracy assured
Accuracy is assured when you decide to purchase a Beamex®
calibrator. They are all delivered with a traceable, accredited
calibration certificate.

Integrated calibration solutions
Beamex calibrators, workstations, calibration software and
professional services form an integrated, automated system.

Industry pioneer with global presence
A forerunner in developing high-quality calibration equipment and
software, with a global customer base and partner network.

High customer satisfaction
Constantly improving understanding of customer needs and developing
solutions to meet them.

Support
Installation, training, validation, system integration, database
conversion, Help Desk and re-calibration services available.
www.beamex.com