
Ultimate Calibration
Beamex is a technology and service company that develops, manufactures and
markets high-quality calibration equipment, software, systems and services for the
calibration and maintenance of process instruments. The company is a leading
worldwide provider of integrated calibration solutions that meet even the most demanding
requirements. Beamex offers a comprehensive range of products and services, from portable
calibrators to workstations, calibration accessories, calibration software, industry-specific
solutions and professional services. Through Beamex’s global and competent partner network,
their products and services are available in more than 60 countries. As proof of Beamex’s
success, more than 10,000 companies worldwide utilize their calibration solutions.
Several companies have been Beamex customers since the establishment of the company
over 30 years ago. For more information about Beamex and its products and services,
visit www.beamex.com

Copyright © 2009 by Beamex Oy Ab. All rights reserved.
No part of this publication may be reproduced or distributed in
any form or by any means, or stored in a database or retrieval
system, without the prior written permission of Beamex Oy Ab.
Requests should be directed to info@beamex.com.

Graphic design: Studio PAP


Photos: Mats Sandström and image bank

Printed by: Fram in Vaasa, Finland


Contents

Acknowledgements
Preface by the CEO of Beamex Group

PART 1
Quality standards and industry regulations
A basic quality calibration program

PART 2
Traceable and efficient calibrations in the process industry
How often should instruments be calibrated?
How often should calibrators be calibrated?
Automated calibration planning lowers costs
Benefits of integrating calibration software with CMMS
Fieldbus transmitters must also be calibrated
Calibration of weighing instruments, Part 1
Calibration of weighing instruments, Part 2
Calibration in hazardous environments
Improving power plant performance through calibration
Calibration in the pharmaceutical industry
The benefits of using a documenting calibrator
The safest way to calibrate
Why use software for calibration management?

APPENDIX: Calibration terminology A to Z


foreword

Acknowledgements

This book is the result of work that has taken place between
2006 and 2009. A team of experts in industry and calibration
worldwide has contributed to the creation of this book by
writing articles related to calibration.
On behalf of Beamex, I would
like to thank all of the people who
have contributed to the creation
of this book:
Jay Bucher, Mike Farkas, Heikki
Laurila, Osmo Luotsinen, Pertti
Mäki, Dean Palmer, Aimo Pusa,
David Spitzer and Vance VanDoren.
Additionally, I would like to
thank the team at Beamex, which
has made a huge contribution
to developing the concept and
planning the content of the book,
overseeing the graphic design,
providing technical know-how for
the writers, proof-reading the texts
as well as providing photographs
to illustrate the book. Without the
support of these people, this book
could not have been published.
Thank you!

villy lindfelt
marketing director
beamex group

preface by the ceo of beamex group

Preface

Calibrators, calibration software and other related ‘tools’ have
developed significantly during the past few decades in spite of
the fact that the calibration of measurement devices as such
has existed for several thousands of years.
Presently, the primary challenges of industrial metrology and
calibration include how to simplify and streamline the entire
calibration process, how to eliminate double work, how to reduce
production down-time, and how to lower the risk of human errors.
These all can be tackled by improving the level of system integration
and automation.
Calibration and calibrators can no longer be considered as isolated,
stand-alone devices, systems or work processes within a company or
a production plant. Just like any other business function, calibration
procedures need to be automated to a higher degree and integrated to
achieve improvements in quality and efficiency. This is the area where
Beamex as a company aims to be the benchmark in the industry.
This book, Ultimate Calibration, has two main purposes: firstly,
to give some general information about industrial calibration, and
secondly to give a deeper introduction to the ‘world of integrated
and automated calibration’ in the form of independent articles. The
content of the book has been selected and organized in such a way
that as many readers as possible are able to find something useful and
interesting in it.
This book has several professional contributors as stated in the
Acknowledgements but here I want to express my special thanks to
Villy, our Marketing Director, whose idea the book originally was and
who directed the project. I hope this book will assist you
in learning new things and provide you with fresh ideas. Enjoy your
reading!

raimo ahola
ceo
beamex group

PART 1
quality standards and industry regulations

What quality standards and industry regulations say about calibration requirements

Introduction

Before going into what the current standards and regulations
actually state, here is a reminder from times past about
measurement practices and how important they really are.
Immersion in water makes the straight seem bent; but reason, thus
confused by false appearance, is beautifully restored by measuring,
numbering and weighing; these drive vague notions of greater or less
or more or heavier right out of the minds of the surveyor, the computer,
and the clerk of the scales. Surely it is the better part of thought that
relies on measurement and calculation. (Plato, The Republic, 360 B.C.)
There shall be standard measures of wine, beer, and corn…
throughout the whole of our kingdom, and a standard width of dyed
russet and cloth; and there shall be standard weights also. (Clause 35,
Magna Carta, 1215)
When you can measure what you are speaking about and express
it in numbers, you know something about it; but when you cannot
express it in numbers, your knowledge is of a meager and unsatisfactory
kind. It may be the beginning of knowledge, but you have scarcely, in
your thoughts, advanced to the stage of science. (William Thomson,
1st Baron Kelvin, GCVO, OM, PC, PRS, 26 June 1824–17 December
1907; A.K.A. Lord Kelvin).1
One of the earliest records of precise measurement is from Egypt.
The Egyptians studied the science of geometry to assist them in the
construction of the Pyramids. It is believed that about 3000 years B.C.,
the Egyptian unit of length came into being.


The “Royal Egyptian Cubit” was decreed to be equal to the length
of the forearm from the bent elbow to the tip of the extended middle
finger plus the width of the palm of the hand of the Pharaoh or King
ruling at that time.2
The “Royal Cubit Master” was carved out of a block of granite to
endure for all times. Workers engaged in building tombs, temples,
pyramids, etc. were supplied with cubits made of wood or granite. The
Royal Architect or Foreman of the construction site was responsible for
maintaining and transferring the unit of length to the workers’ instruments.
They were required to bring back their cubit sticks at each full moon
to be compared to the Royal Cubit Master.
Failure to do so was punishable by death. Though the punishment
prescribed was severe, the Egyptians had anticipated the spirit of the
present day system of legal metrology, standards, traceability and
calibration recall.
With this standardization and uniformity of length, the Egyptians
achieved surprising accuracy. Thousands of workers were engaged in
building the Great Pyramid of Giza. Through the use of cubit sticks,
they achieved an accuracy of 0.05%. In roughly 756 feet or 230.36276
meters, they were within 11.43 centimeters.
The need for calibration has been around for at least 5000 years.
In today’s calibration environment, there are basically two types of
requirements: ISO standards and regulatory requirements. The biggest
difference between the two is simple – ISO standards are voluntary, and
regulatory requirements are mandatory. If an organization volunteers
to meet ISO 9000 standards, they pay a company to audit them to that
standard to ensure they are following their quality manual and are
within compliance. On the other hand, if a company is manufacturing
a drug that must meet regulatory requirements, they are inspected
by government inspectors for compliance to federal regulations. In
the case of ISO standards, a set of guidelines are used to write their
quality manual and other standard operating procedures (SOPs)
and they show how they comply with the standard. However, the
federal regulations specify in greater detail what a company must do
to meet the requirements set forth in the Code of Federal Regulations
(CFRs).


Calibration requirements according to the U.S. Food and Drug Administration (FDA)
Following are examples of some of the regulations required by the FDA,
and what they say about calibration and what must be accomplished to
meet the CFRs. Please note that European standards are very similar to
FDA requirements. Listed below are several different parts of 21CFR
that relate to the calibration of test equipment in different situations
and environments.

TITLE 21 — FOOD AND DRUGS
PART 820 – QUALITY SYSTEM REGULATION (QSR)
(as of 1 April, 2008 – online June 11, 2008)

The QSR

820.1 Scope.
820.3 Definitions.
820.5 Quality system.
820.20 Management responsibility.
820.22 Quality audit.
820.25 Personnel.
820.30 Design controls.
820.40 Document controls.
820.50 Purchasing controls.
820.60 Identification.
820.65 Traceability.
820.70 Production and process controls.
820.72 Inspection, measuring, and test equipment.
820.75 Process validation.
820.80 Receiving, in-process, and finished device acceptance.
820.86 Acceptance status.
820.90 Nonconforming product.
820.100 Corrective and preventive action.
820.120 Device labeling.
820.130 Device packaging.
820.140 Handling.
820.150 Storage.
820.160 Distribution.
820.170 Installation.
820.180 General requirements.
820.181 Device master record.
820.184 Device history record.
820.186 Quality system record.
820.198 Complaint files.
820.200 Servicing.
820.250 Statistical techniques.


The part covering calibration is 21CFR Part 820.72. It is very specific
about the requirements, as seen in the following paragraphs:

Sec. 820.72 Inspection, measuring, and test equipment.

(a) Control of inspection, measuring, and test equipment. Each
manufacturer shall ensure that all inspection, measuring, and
test equipment, including mechanical, automated, or electronic
inspection and test equipment, are suitable for their intended
purposes and capable of producing valid results. Each manufacturer
shall establish and maintain procedures to ensure that equipment
is routinely calibrated, inspected, checked, and maintained. The
procedures shall include provisions for handling, preservation, and
storage of equipment, so that its accuracy and fitness for use are
maintained. These activities shall be documented.

(b) Calibration. Calibration procedures shall include specific
directions and limits for accuracy and precision. When accuracy
and precision limits are not met, there shall be provisions for
remedial action to reestablish the limits and to evaluate whether
there was any adverse effect on the device’s quality. These activities
shall be documented.
(1) Calibration standards. Calibration standards used for
inspection, measuring, and test equipment shall be traceable to
national or international standards. If national or international
standards are not practical or available, the manufacturer shall
use an independent reproducible standard. If no applicable
standard exists, the manufacturer shall establish and maintain
an in-house standard.
(2) Calibration records. The equipment identification, calibration
dates, the individual performing each calibration, and the
next calibration date shall be documented. These records shall
be displayed on or near each piece of equipment or shall be
readily available to the personnel using such equipment and to
the individuals responsible for calibrating the equipment.

Please note that paragraphs (a), (b), (1) and (2) all have requirements
for documenting everything. In the FDA environment, if there
is no documentation for an action, then the action did not occur.
Documented evidence is critical to showing that something happened,


when it happened, what occurred using each piece of equipment, and
who did the action on what date. There cannot be enough stress placed
on documenting everything in an FDA regulated company. This
includes training, updating of SOPs and calibration procedures, and
the repair and replacement of parts in test equipment. All aspects of a
quality calibration program must be documented for several reasons,
not the least of which is avoiding having to reinvent the wheel every
time a process is improved or updated.
The following are the requirements for cGMP (current Good
Manufacturing Practice) under the FDA.

TITLE 21 — FOOD AND DRUGS
PART 211 – CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS
Subpart D – Equipment

Sec. 211.68 Automatic, mechanical, and electronic equipment.

(a) Automatic, mechanical, or electronic equipment or other types
of equipment, including computers, or related systems that will
perform a function satisfactorily, may be used in the manufacture,
processing, packing, and holding of a drug product. If such
equipment is so used, it shall be routinely calibrated, inspected, or
checked according to a written program designed to assure proper
performance. Written records of those calibration checks and
inspections shall be maintained.3

TITLE 21 — FOOD AND DRUGS
PART 211 – CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS
Subpart I – Laboratory Controls

Sec. 211.160 General requirements.

(a) The establishment of any specifications, standards, sampling
plans, test procedures, or other laboratory control mechanisms
required by this subpart, including any change in such
specifications, standards, sampling plans, test procedures, or
other laboratory control mechanisms, shall be drafted by the
appropriate organizational unit and reviewed and approved by


the quality control unit. The requirements in this subpart shall
be followed and shall be documented at the time of performance.
Any deviation from the written specifications, standards, sampling
plans, test procedures, or other laboratory control mechanisms
shall be recorded and justified.

(b) Laboratory controls shall include the establishment of scientifically
sound and appropriate specifications, standards, sampling plans,
and test procedures designed to assure that components, drug
product containers, closures, in-process materials, labeling, and
drug products conform to appropriate standards of identity,
strength, quality, and purity. Laboratory controls shall include:

(4) The calibration of instruments, apparatus, gauges, and recording
devices at suitable intervals in accordance with an established
written program containing specific directions, schedules,
limits for accuracy and precision, and provisions for remedial
action in the event accuracy and/or precision limits are not met.
Instruments, apparatus, gauges, and recording devices not meeting
established specifications shall not be used.4

TITLE 21 — FOOD AND DRUGS
PART 11 – ELECTRONIC RECORDS; ELECTRONIC SIGNATURES
Subpart A – General Provisions

Sec. 11.1 Scope.

(a) The regulations in this part set forth the criteria under which
the agency considers electronic records, electronic signatures,
and handwritten signatures executed to electronic records to be
trustworthy, reliable, and generally equivalent to paper records
and handwritten signatures executed on paper.

(b) This part applies to records in electronic form that are created,
modified, maintained, archived, retrieved, or transmitted, under
any records requirements set forth in agency regulations. This
part also applies to electronic records submitted to the agency
under requirements of the Federal Food, Drug, and Cosmetic Act
and the Public Health Service Act, even if such records are not


specifically identified in agency regulations. However, this part
does not apply to paper records that are, or have been, transmitted
by electronic means.

(c) Where electronic signatures and their associated electronic records
meet the requirements of this part, the agency will consider
the electronic signatures to be equivalent to full handwritten
signatures, initials, and other general signings as required by
agency regulations, unless specifically excepted by regulation(s)
effective on or after August 20, 1997.

(d) Electronic records that meet the requirements of this part may be
used in lieu of paper records, in accordance with Sec. 11.2, unless
paper records are specifically required.

(e) Computer systems (including hardware and software), controls,
and attendant documentation maintained under this part shall be
readily available for, and subject to, FDA inspection.

(f) This part does not apply to records required to be established or
maintained by Sec. 1.326 through 1.368 of this chapter. Records
that satisfy the requirements of part 1, subpart J of this chapter, but
that also are required under other applicable statutory provisions
or regulations, remain subject to this part.

Sec. 11.2 Implementation.

(a) For records required to be maintained but not submitted to the
agency, persons may use electronic records in lieu of paper records
or electronic signatures in lieu of traditional signatures, in whole
or in part, provided that the requirements of this part are met.

(b) For records submitted to the agency, persons may use electronic
records in lieu of paper records or electronic signatures in lieu of
traditional signatures, in whole or in part, provided that:
(1)  The requirements of this part are met; and
(2) The document or parts of a document to be submitted have
been identified in public docket No. 92S-0251 as being the
type of submission the agency accepts in electronic form. This
docket will identify specifically what types of documents or
parts of documents are acceptable for submission in electronic


form without paper records and the agency receiving unit(s)
(e.g., specific center, office, division, branch) to which such
submissions may be made. Documents to agency receiving
unit(s) not specified in the public docket will not be considered
as official if they are submitted in electronic form; paper
forms of such documents will be considered as official and
must accompany any electronic records. Persons are expected
to consult with the intended agency receiving unit for details
on how (e.g., method of transmission, media, file formats,
and technical protocols) and whether to proceed with the
electronic submission.

Sec. 11.100 General requirements.

(a) Each electronic signature shall be unique to one individual and
shall not be reused by, or reassigned to, anyone else.

(b) Before an organization establishes, assigns, certifies, or otherwise
sanctions an individual’s electronic signature, or any element of
such electronic signature, the organization shall verify the identity
of the individual.

(c) Persons using electronic signatures shall, prior to or at the time
of such use, certify to the agency that the electronic signatures
in their system, used on or after August 20, 1997, are intended
to be the legally binding equivalent of traditional handwritten
signatures.
(1) The certification shall be submitted in paper form and signed
with a traditional handwritten signature, to the Office of
Regional Operations (HFC-100), 5600 Fishers Lane, Rockville,
MD 20857.
(2) Persons using electronic signatures shall, upon agency request,
provide additional certification or testimony that a specific
electronic signature is the legally binding equivalent of the
signer’s handwritten signature.5


GAMP
Good Automated Manufacturing Practice (GAMP) is a technical
sub-committee, known as a COP (Community Of Practice) of the
International Society for Pharmaceutical Engineering (ISPE). The goal
of the community is to promote the understanding of the regulation
and use of automated systems within the pharmaceutical industry.
The GAMP COP organizes discussion forums for its members. ISPE
organises GAMP related training courses and educational seminars.
Several local GAMP COPs, such as GAMP Americas, GAMP Nordic,
GAMP DACH (Germany, Austria, Switzerland), GAMP Francophone,
GAMP Italiano and GAMP Japan bring the GAMP community closer
to its members, in collaboration with ISPE’s local Affiliates in these
regions.
The GAMP publishes a series of Good Practice Guides for its
members on several topics involved in drug manufacturing. The most
well-known is The Good Automated Manufacturing Practice (GAMP)
Guide for Validation of Automated Systems in Pharmaceutical Manufacture.
The last major revision (GAMP5) was released in January 2008.

Other guidelines are:

•  GAMP Good Practice Guide: A Risk-Based Approach to Compliant Electronic Records and Signatures
•  GAMP Good Practice Guide: Calibration Management
•  GAMP Good Practice Guide: Electronic Data Archiving
•  GAMP Good Practice Guide: Global Information Systems Control and Compliance
•  GAMP Good Practice Guide: IT Infrastructure Control and Compliance
•  GAMP Good Practice Guide: Testing of GxP Systems
•  GAMP Good Practice Guide: Validation of Laboratory Computerized Systems
•  GAMP Good Practice Guide: Validation of Process Control Systems

GAMP itself was founded in 1991 in the United Kingdom to deal
with the evolving FDA expectations for Good Manufacturing Practice
(GMP) compliance of manufacturing and related systems. In 1994,
the organization entered into a partnership with the ISPE and
published its first GAMP guidelines.6
The GAMP Good Practice Guide: Calibration Management is the
first in a series of GAMP guidance on specific topics. Developed
by the Calibration Management Special Interest Group (SIG) of
ISPE’s GAMP Forum in conjunction with representatives from
the pharmaceutical industry and regulators, the Guide describes
the principles of calibration and presents guidance in setting up a
calibration management system, providing the structured approach
required to establish formal criticality assessment, management,
documentation, and corrective actions vital to regulatory compliance.
Featured Topics – The Guide is structured to follow the life cycle of
validation and is intended to cover initial calibration in addition to:

•  periodic inspection, testing, and calibration
•  laboratory instrumentation
•  quality
•  safety
•  environmental issues
•  regulatory requirements7

ISO 9001:2008
Basically, this is what is required according to ISO 9001:2008:

7.6 CONTROL OF MONITORING AND MEASURING EQUIPMENT

•  Identify your organization’s monitoring and measuring needs
and requirements (if your test instrument makes a quantitative
measurement, it requires periodic calibration); and select test
equipment that can meet those monitoring and measuring needs
and requirements.
•  Establish monitoring and measuring processes (calibration
procedures and calibration record templates for recording your
calibration results).
•  Calibrate your monitoring and measuring equipment using a periodic
schedule to ensure that results are valid (you should also perform
a yearly evaluation of your calibration results to see if there is a
need to increase or decrease your calibration intervals on calibrated
test equipment; a simple sketch of such an evaluation follows this
list). All calibrations must be traceable to a national or
international standard or artifact.
•  Protect your monitoring and measuring equipment (this includes
during handling, preservation, storage, transportation, and shipping
of all test instruments – to include your customer’s items, and your
calibration standards).
•  Confirm that monitoring and measuring software is capable of
doing the job you want it to do (your software needs to be validated
before being used, and when required, your test instruments may
need to be qualified prior to use). 
•  Evaluate the validity of previous measurements whenever you
discover that your measuring or monitoring equipment is out-of-
calibration (as stated in the FDA regulations, “When accuracy and
precision limits are not met, there shall be provisions for remedial
action to reestablish the limits and to evaluate whether there was
any adverse effect on the device’s quality”; this is just as applicable
when dealing with ISO as with any other standard or regulation;
especially when the out of tolerance item is a calibration standard,
and may have affected numerous items of test equipment over a
period of time).
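
As a minimal sketch of the yearly interval evaluation mentioned in the
list above, the following Python fragment applies a simple rule of thumb.
The thresholds, factors and function name are illustrative assumptions
for demonstration, not requirements of ISO 9001:2008.

    def evaluate_interval(interval_months: int,
                          in_tolerance_history: list[bool]) -> int:
        """Suggest a new calibration interval from recent pass/fail history.

        in_tolerance_history: outcomes of recent calibrations
        (True = in tolerance), gathered during the yearly review.
        """
        if not in_tolerance_history:
            return interval_months                # no history: leave unchanged
        if all(in_tolerance_history) and len(in_tolerance_history) >= 3:
            return int(interval_months * 1.5)     # consistently in tolerance: extend
        if in_tolerance_history.count(False) >= 2:
            return max(1, interval_months // 2)   # repeated OOTs: shorten
        return interval_months                    # mixed history: leave unchanged

    # Example: a gauge on a 12-month interval with three straight passes
    # would be suggested for an 18-month interval.
    print(evaluate_interval(12, [True, True, True]))   # prints 18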

ISO 17025
ISO 17025 – General requirements for the competence of testing and calibration
laboratories. According to ISO 17025, this standard is applicable to all
organizations performing tests and/or calibrations. These include first-,
second-, and third-party laboratories, and laboratories where testing
and/or calibration forms part of inspection and product certifications.
Please keep in mind that if your calibration function and/or metrology
department fall under the requirements of your company, whether it be
for compliance to an ISO standard (ISO 9001:2000 or ISO 13485) or
an FDA requirement (cGMP, QSR, etc.), then you do not have any
obligation to meet the ISO 17025 standard. You already fall under a
quality system that takes care of your calibration requirements.

ANSI/NCSL Z540.3-2006
ANSI/NCSL Z540.3-2006 – American National Standard for Calibration –
Requirements for the Calibration of Measuring and Test Equipment.
The objective of this National Standard is to establish the technical
requirements for the calibration of measuring and test equipment.
This is done through the use of a system of functional components.
Collectively, these components are used to manage and assure that


the accuracy and reliability of the measuring and test equipment are
in accordance with identified performance requirements.
In implementing its objective, this National Standard describes the
technical requirements for establishing and maintaining:
•  the acceptability of the performance of measuring and test
equipment;
•  the suitability of a calibration for its intended application;
•  the compatibility of measurements with the National Measurement
System; and
•  the traceability of measurement results to the International System
of Units (SI).
In the development of this National Standard attention has been
given to:
•  expressing the technical requirements for a calibration system
supporting both government and industry needs;
•  applying best practices and experience with related national,
international, industry, and government standards; and
•  balancing the needs and interests of all stakeholders.
In addition, this National Standard includes and updates the relevant
calibration system requirements for measuring and test equipment
described by the previous standards, Part 11 of ANSI/NCSL Z540.1
(R2002) and Military Standard 45662A.
This National Standard is written for both Supplier and Customer,
each term being interpreted in the broadest sense. The “Supplier” may
be a producer, distributor, vendor, or a provider of a product, service,
or information. The “Customer” may be a consumer, client, end-user,
retailer, or purchaser that receives a product or service.
Reference to this National Standard may be made by:
•  customers when specifying products (including services) required;
•  suppliers when specifying products offered;
•  legislative or regulatory bodies;
•  agencies or organizations as a contractual condition for procurement;
and
•  assessment organizations in the audit, certification, and other
evaluations of calibration systems and their components.
This National Standard is specific to calibration systems. A
calibration system operating in full compliance with this National
Standard promotes confidence and facilitates management of the risks
associated with measurements, tests, and calibrations.8


Equipment intended for use in potentially explosive atmospheres (ATEX)
What are ATEX and IECEx?

ATEX (“ATmosphères EXplosibles”, explosive atmospheres in French)
is a standard set in the European Union for explosion protection in the
industry. ATEX 95 equipment directive 94/9/EC concerns equipment
intended for use in potentially explosive areas. Companies in the
EU where the risk of explosion is evident must also use the ATEX
guidelines for protecting the employees. In addition, the ATEX rules
are obligatory for electronic and electrical equipment that will be
used in potentially explosive atmospheres sold in the EU as of July
1, 2003.
IEC (International Electrotechnical Commission) is a nonprofit
international standards organization that prepares and publishes
International Standards for electrical technologies. The IEC TC/31
technical committee deals with the standards related to equipment for
explosive atmospheres. IECEx is an international scheme for certifying
procedures for equipment designed for use in explosive atmospheres.
The objective of the IECEx Scheme is to facilitate international trade
in equipment and services for use in explosive atmospheres, while
maintaining the required level of safety.
In most cases, test equipment that is required to be operated in
an explosive environment would be qualified and installed by the
company’s facility services department and not the calibration
personnel. One must also keep in mind that there would be two
different avenues for the calibration of those pieces of test equipment:
on-site and off-site. If the test instrument that is used in an explosive
environment must be calibrated on-site (in the explosive environment),
then all the standards used for that calibration must also comply
with explosive environment directives. However, if it were possible
to remove the test equipment from the explosive environment when
due for its periodic calibration, then there would be no requirement
for the standards used for their calibration to meet the explosive
environment directives, saving money on expensive standards and
possibly expensive training of calibration personnel in order for them
to work in those conditions.
Having said that, there may be a need for the calibration personnel
to be aware of the ATEX regulations. An informative website for


information on ATEX can be found at the following link:
http://ec.europa.eu/enterprise/atex/indexinfor.htm. Several languages
are available for retrieving the information.
Another informative website is the International Electrotechnical
Commission Scheme for Certification to Standards Relating to
Equipment for use in Explosive Atmospheres (IECEx Scheme). The
link is: http://www.iecex.com/guides.htm.

1. Bucher, Jay L. 2007. The Quality Calibration Handbook. Milwaukee: ASQ Quality Press.
2. The Story of the Egyptian Cubit. http://www.ncsli.org/misc/cubit.cfm (18 October, 2008)
3. 21CFR Part 211. http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?fr=211.68 (29 October, 2008)
4. 21CFR Part 211. http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?fr=211.160 (29 October, 2008)
5. 21CFR Part 11. http://www.access.gpo.gov/nara/cfr/waisidx_08/21cfr11_08.html (5 December, 2008)
6. GAMP. http://en.wikipedia.org/wiki/Good_Automated_Manufacturing_Practice (5 December, 2008)
7. Techstreet. http://www.techstreet.com/cgi-bin/detail?product_id=1224147 (5 December, 2008)
8. NCSL International. 2006. ANSI/NCSL Z540.3-2006. Boulder, CO.

a basic quality calibration program

A basic quality
calibration program

Why do we calibrate?

R&D departments are tasked with coming up with the answers
to many problems; the cure for cancer is one of them. Let’s
imagine that the Acme Biotech Co. has found the cure for
cancer. Their R&D section sends the formula to their operations &
manufacturing division. The cure cannot be replicated with consistent
results. They are not using calibrated test instruments in the company.
Measurements made by R&D are different than those made by the
operations section. If all test equipment were calibrated to a traceable
standard, then repeatable results would ensure that what’s made in one
part of the company is also repeated in another part of the company.
The company loses time, money, their reputation, and possibly the
ability to stay in business simply because they do not use calibrated
test equipment. A fairy tale? Hardly. This scenario is repeated
every day throughout the world.
Without calibration, or by using incorrect calibrations, all of us pay
more at the gas pump, for food weighed incorrectly at the checkout
counter, and for manufactured goods that do not meet their stated
specifications. Incorrect amounts of ingredients in your prescription
and over-the-counter (OTC) drugs can cost more, or even cause
illness or death. Because of poor or incorrect calibration, criminals
are either not convicted or are released on bad evidence. Crime labs
cannot identify the remains of victims or wrongly identify victims
in the case of mass graves. Airliners fly into mountaintops and off
the ends of runways because they don’t know their altitude and/or


speed. Babies are not correctly weighed at birth. The amount of drugs
confiscated in a raid determines whether the offense is a misdemeanor
or a felony; which weight is correct? As one can see, having the correct
measurements throughout any and all industries is critical to national
and international trade and commerce.
The bottom line is this – all test equipment that makes a quantitative
measurement requires periodic calibration. It is as simple as that.
However, before we go any further, we need to clarify two definitions
that are critical to this subject – calibration and traceability.

By definition:

Calibration is a comparison of two measurement devices or systems,
one of known uncertainty (your standard) and one of unknown
uncertainty (your test equipment or instrument).
Traceability is the property of the result of a measurement or the value
of a standard whereby it can be related to stated references, usually
national or international standards, through an unbroken chain of
calibrations all having stated uncertainties.

The calibration of any piece of equipment or system is simply
a comparison between the standard being used (with its known
uncertainty), and the unit under test (UUT) or test instrument that
is being calibrated (the uncertainty is unknown, and that is why it is
being calibrated). It does not make any difference if you adjust, align or
repair the item, nor if you cannot adjust or align it. The comparison to
a standard that is more accurate, no matter the circumstances is called
calibration. Many people are under the misconception that an item
must be adjusted or aligned in order to be calibrated. Nothing could
be further from the truth.
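
Expressed in code, a single calibration point is nothing more than this
comparison. The Python sketch below is a minimal illustration of the
definition above (the function name and the example tolerance are
illustrative assumptions); note that no adjustment is involved, since the
comparison itself is the calibration.

    def calibrate_point(standard_reading: float, uut_reading: float,
                        tolerance: float) -> tuple[float, bool]:
        """Compare a unit under test (UUT) against a standard at one point,
        returning the UUT error and whether it is within tolerance."""
        error = uut_reading - standard_reading
        return error, abs(error) <= tolerance

    # Example: the standard applies 100.00 °C, the UUT indicates 100.12 °C,
    # and the tolerance is ±0.25 °C, so the UUT is in tolerance as found.
    error, in_tol = calibrate_point(100.00, 100.12, 0.25)
    print(f"error = {error:+.2f} °C, in tolerance: {in_tol}")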
Before we can get any deeper into what traceability is, we should
explain two different traceability pyramids. When we talk about
traceability to a national or international standard, the ‘everyday
calibration technician’ is usually situated close to the bottom of the
pyramid, so a graphic illustration of these pyramids is important.
The two examples in figures 1 and 2 are similar, but differ depending
on where you are in the chain, or in certain parts of the world.


Figure 1. Traceability pyramid, from top to bottom: BIPM; NMIs;
Reference standards; Working metrology labs; General purpose
calibration labs (inside a company); User’s test equipment.

Figure 2. Traceability pyramid, from top to bottom: SI units;
Primary standards; Secondary standards; Reference standards;
Working standards; User’s test equipment.

Note: NMI = National Metrology Institute

There are basically two ways to maintain traceability during
calibration – the use of an uncertainty budget (performing uncertainty
calculations for each measurement); and using a test uncertainty ratio
(TUR) of ≥ 4:1. First, let’s discuss the use of uncertainty budgets.
According to the European cooperation for Accreditation of
Laboratories publication EAL-G12, Traceability of Measuring and Test
Equipment to National Standards (whose purpose is to give guidance
on the calibration and maintenance of measuring equipment in meeting
the requirements of the ISO 9000 series of standards for quality systems,
and the EN 45001 standard for the operation of testing laboratories),
paragraphs 4 and 5 are very specific in their requirements:


4  Why are calibrations and traceability necessary?
4.1 Traceability of measuring and test equipment to national standards
by means of calibration is necessitated by the growing national and
international demand that manufactured parts be interchangeable;
supplier firms that make products, and customers who install them
with other parts, must measure with the ‘same measure’.
4.2 There are legal as well as technical reasons for traceability of
measurement. Relevant laws and regulations have to be complied
with just as much as the contractual provisions agreed with the
purchaser of the product (guarantee of product quality) and the
obligation to put into circulation only products whose safety, if
they are used properly, is not affected by defects.
Note: If binding requirements for the accuracy of measuring
and test equipment have been stipulated, failure to meet these
requirements means the absence of a warranted quality with
considerable consequent liability.
4.3 If it becomes necessary to prove absence of liability, the producer
must be able to demonstrate, by reference to a systematic and fully
documented system, that adequate measuring and test equipment
was chosen, was in proper working order and was used correctly
for controlling a product.
4.4 There are similar technical and legal reasons why calibration and
testing laboratory operators should have consistent control of
measuring and test equipment in the manner described.

5  Elements of traceability
5.1 Traceability is characterised by a number of essential elements:
(a) an unbroken chain of comparisons going back to a standard
acceptable to the parties, usually a national or international
standard;
(b) measurement uncertainty; the measurement uncertainty
for each step in the traceability chain must be calculated
according to agreed methods and must be stated so that an
overall uncertainty for the whole chain may be calculated;
(c) documentation; each step in the chain must be performed
according to documented and generally acknowledged
procedures; the results must equally be documented;
(d) competence; the laboratories or bodies performing one or


more steps in the chain must supply evidence for their technical
competence, e.g. by demonstrating that they are accredited;
(e) reference to SI units; the chain of comparisons must end at
primary standards for the realization of the SI units;
(f) re-calibrations; calibrations must be repeated at appropriate
intervals; the length of these intervals will depend on a number
of variables, e.g. uncertainty required, frequency of use, way of
use, stability of the equipment.
5.2 In many fields, reference materials take the position of physical
reference standards. It is equally important that such reference
materials are traceable to relevant SI units. Certification of
reference materials is a method that is often used to demonstrate
traceability to SI units.1
The other document that goes hand-in-hand with this is EA
4/02, Expression of the Uncertainty of Measurement in Calibration.
The purpose of this document is to harmonise evaluation of
uncertainty of measurement within EA, to set up, in addition to the
general requirements of EAL-R1, the specific demands in reporting
uncertainty of measurement on calibration certificates issued by
accredited laboratories and to assist accreditation bodies with a coherent
assignment of best measurement capability to calibration laboratories
accredited by them. As the rules laid down in this document are in
compliance with the recommendations of the Guide to the Expression
of Uncertainty in Measurement, published by seven international
organisations concerned with standardisation and metrology, the
implementation of EA-4/02 will also foster the global acceptance of
European results of measurement.2
By understanding and following both of these documents, a
calibration function can easily maintain traceable calibrations for
the requirements demanded by their customers and the standard or
regulation that their company needs to meet.
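
As a rough illustration of what an uncertainty budget involves, the
sketch below follows the general approach of the documents above:
standard uncertainties combined in quadrature, then expanded with a
coverage factor k = 2. The component names and values are invented for
the example and do not come from EAL-G12 or EA-4/02.

    import math

    # Invented example components, each already expressed as a standard
    # uncertainty (1 sigma) in the unit of the measurement:
    components = {
        "reference standard (from its certificate)": 0.010,
        "resolution of the UUT": 0.006,
        "repeatability (type A, from repeated readings)": 0.008,
        "temperature influence": 0.004,
    }

    # Combined standard uncertainty: root-sum-of-squares of the components.
    u_c = math.sqrt(sum(u ** 2 for u in components.values()))

    # Expanded uncertainty with coverage factor k = 2 (roughly 95 % coverage),
    # the form typically stated on calibration certificates.
    U = 2 * u_c
    print(f"u_c = {u_c:.4f}, U (k=2) = {U:.4f}")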
To maintain traceability, without using uncertainty budgets or
calculations, you must ensure your standards are at least four times (4:1)
more accurate than the test equipment being calibrated. Where does
this ratio of four to one (4:1) come from? It comes from the American
National Standard for Calibration (ANSI/NCSL Z540.3-2006),
which states: “Where calibrations provide for verification that measurement
quantities are within specified tolerances…Where it is not practical to estimate
this probability, the TUR shall be equal to or greater than 4:1.”
So, if a TUR of equal to or greater than 4:1 is maintained, then


traceability is assured. Keep in mind that a TUR of 4:1 somewhere
along the chain of calibrations may not have been feasible, and
uncertainty calculations were performed and their uncertainty stated
on the certificate of calibration. This is correct and acceptable. In most
circumstances, the need to maintain a TUR of 4:1 comes into
play at the company or shop level, where the customer’s test equipment
is usually used for production or manufacturing purposes only.
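
As a shop-level sketch of the 4:1 check (ANSI/NCSL Z540.3 defines the
TUR more precisely, in terms of the tolerance span and the expanded
uncertainty of the calibration process; the simple ratio and the numbers
below are an illustrative approximation only):

    def tur(uut_tolerance: float, standard_uncertainty: float) -> float:
        """Approximate test uncertainty ratio: UUT tolerance versus the
        uncertainty of the standard used to calibrate it."""
        return uut_tolerance / standard_uncertainty

    # Invented example: UUT tolerance ±0.5 psi, standard uncertainty ±0.1 psi.
    ratio = tur(0.5, 0.1)
    print(f"TUR = {ratio:.1f}:1, meets 4:1 requirement: {ratio >= 4.0}")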
So how does calibration and traceability fit into the big picture?
What does the big picture look like? Why do you need a quality
calibration program?
You need to establish a quality calibration program to ensure that
all operations throughout the metrology department occur in a stable
manner. The effective operation of such a system will hopefully result
in stable processes and, therefore, in a consistent output from those
processes. Once stability and consistency are achieved, it is possible
to initiate process improvements. This is applicable in every phase of
a production and/or manufacturing program, but it is especially true
in a metrology department.3
Let’s take for example a calibration program that has six calibration
technicians on staff. Four of them work in another facility calibrating
the same types of equipment as the other two. However, the other two
have far more experience and through no fault of their own do not use
the calibration procedures that are required by their quality system.
They have calibrated the same items for several years and feel there is
nothing new to learn. One of the four calibration technicians (who
are always following the calibration procedures) finds a faster,
more economical way to perform a specific calibration. They submit a
change proposal for the calibration procedure and everyone is briefed
and trained on the new technique. The four calibration technicians
that have been following the calibration procedure improve their
production and save the company money. The two ‘old timers’ have a
reduction in their production and actually cost the company money. If
everyone had been using the calibration procedures as they were supposed
to, then this would not have happened.
Process improvements cannot take place across the department if
everyone is not doing the job the same way each and every time they
perform a calibration.
We are not naive enough to believe that when calibration
technicians have performed a particular calibration hundreds or even
thousands of times, they are going to follow calibration procedures


word for word. Of course not. But they must have their calibration
procedure on hand each time they are performing the calibration. If
a change has been made to that procedure, the calibration technician
must be trained on the change before they can perform the calibration;
and the appropriate documentation completed to show that training
was accomplished and signed off. When the proper training is not
documented and signed off by the trainer and trainee, then it is the
same as if the training never happened.

What is a quality calibration program?


A quality calibration program consists of several broad items referred
to in the Quality System Regulation (QSR) from the Food and Drug
Administration (FDA). These items are also referred to by other
standards (ISO 9000, etc.) and regulations throughout most industries
that regulate or monitor production and manufacturing of all types of
products. One of the most stringent requirements can be found in the
current Good Manufacturing Practice (cGMP).
The basic premise and foundation of a quality calibration program
is to “Say what you do, Do what you say, Record what you did, Check the
results, and Act on the difference”. Let’s break these down into simple
terms.

“Say what you do” means write in detail how to do your job. This
includes calibration procedures, work instructions and standard
operating procedures (SOPs).
“Do what you say” means follow the documented procedures or
instructions every time you calibrate, or perform a function that
follows specific written instructions.
“Record what you did” means that you must record the results of your
measurements and adjustments, including what your standard(s) read or
indicated both before and after any adjustments might be made.
“Check the results” means make certain the test equipment meets
the tolerances, accuracies, or upper/lower limits specified in your
procedures or instructions.
“Act on the difference” means if the test equipment is out of tolerance,
you’re required to inform the user/owner of the equipment because
they may have to re-evaluate manufactured goods, change a process,
or recall a product.3


“Say what you do” means write in detail how to do your job. This
includes calibration procedures, work instructions and SOPs. All of
your calibration procedures should be formatted the same as other
SOPs within your company. Here is an example of common formatting
for SOPs:
1.  Purpose
2.  Scope
3.  Responsibilities
4.  Definitions
5.  Procedure
6.  Related Procedures
7.  Forms and Records
8.  Document History
After section 4. Definitions, you should have a table listing all of the
instruments or systems that would be calibrated by that procedure,
along with their range and tolerances. After that you should have a
list of the standards to be used to calibrate the items. This table should
also include the standard’s range and specifications. Then the actual
calibration procedure starts in section 5. Procedure.
Manufacturer’s manuals usually provide an alignment procedure
that can be used as a template for writing a calibration procedure. They
should show what standards accomplish the calibration of a specific
range and/or function. A complete calibration must be performed
prior to any adjustment or alignment. An alignment procedure and/
or preventive maintenance inspection (PMI) may be incorporated
into your SOP as long as it is separate from the actual calibration
procedure.
There are, generally speaking, two types of calibration procedures:
Generic: temperature gages and thermometers, pressure and vacuum
gages, pipettes, micrometers, power supplies and water baths.
Specific: spectrophotometers, thermal cyclers, and balances/scales.
Generic SOPs are written to show how to calibrate a large variety of
items in a general context.
Specific SOPs are written to show step-by-step procedures for each
different type of test instrument within a group of items. Often, the
calibration form is designed to follow the numbered steps, removing
any doubt for the calibration technician about which data goes
into which data field.
“Do what you say” means follow the documented procedures or
instructions every time you calibrate, or perform a function that follows


specific written instructions. This means following published calibration
procedures every time you calibrate a piece of test equipment.
Have the latest edition of the procedure available for use by your
calibration technicians. Have a system in place for updating your
procedures. Train your technicians on the changes made to your
procedures every time the procedure is changed or improved – and
document the training.
What do you do when you need to make an improvement, or update
your calibration procedures and/or forms? A formal, written process
must be in place, to include:
•  Who can make changes
•  Who is the final approval authority
•  A revision tracking system
•  A process for validating the changes
•  An archiving system for old procedures
•  Instructions for posting new/removal of old procedures
•  A system for training on revisions
•  A place to document that training was done
“Record what you did” means that you must record the results of your
measurements and adjustments, including what your standard(s) read
or indicated both before and after any adjustments are made, and keep
your calibration records in a secure location. Certain requirements
must be documented in each calibration record. Of course there are
many ways to accomplish this, including:
•  pen and paper
•  “do-it-yourself ” databases, e.g. Excel, Access
•  calibration module of a computerized maintenance management
system (CMMS)
•  calibration software specifically designed for that purpose
These required elements include the identification of the test instrument with a unique
identification number, their part number and range/tolerance. The
location of where the test instrument can be found should also be on
the record. A history of each calibration and a traceability statement
or uncertainty budget must be included. The date of calibration, the
last time it was calibrated, and the next time it will be due for calibration
should be on the form. There should be a place to show what the
standard read, as well as the test instrument’s ‘As Found’ and when
applicable ‘As Left’ readings.
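
When these records are kept electronically, the fields described above
map naturally onto a simple record structure. The Python dataclass below
is a minimal sketch with illustrative field names (real calibration
software or a CMMS would add validation, audit trails and electronic
signatures on top of such a structure):

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class CalibrationRecord:
        """Minimal calibration record, one entry per calibration event."""
        instrument_id: str         # unique identification number
        part_number: str
        location: str              # where the test instrument can be found
        range_and_tolerance: str
        procedure_id: str          # calibration procedure and its revision
        standards_used: list[str]  # standards, with their calibration due dates
        cal_date: date
        next_due: date
        as_found: list[float]      # readings before any adjustment or repair
        as_left: list[float]       # readings after adjustment, when applicable
        passed: bool
        technician: str            # who performed and signed the calibration
        comments: str = ""

    # Illustrative entry:
    record = CalibrationRecord(
        instrument_id="TI-0042", part_number="PG-100", location="Lab 3",
        range_and_tolerance="0-100 psi, ±0.5 psi", procedure_id="CP-017 rev C",
        standards_used=["STD-007 (due 2009-06-01)"],
        cal_date=date(2009, 1, 15), next_due=date(2010, 1, 15),
        as_found=[0.1, 50.2, 99.8], as_left=[0.0, 50.1, 100.0],
        passed=True, technician="J. Smith")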
The ‘As Found’ readings are what the test instrument read the first time
that a calibration is performed, prior to alignment, adjustment or repair.


The entire calibration is performed to see if any part of the calibration
is out of tolerance. If an out-of-tolerance (OOT) condition is found,
record the reading (on the standard and the UUT) and continue with
the rest of the calibration to the end of the calibration procedure. If one
were to stop at the point where an OOT is found, make an adjustment,
then proceed with the calibration, there is a good possibility that the
adjustment affected other ranges or parts of the calibration. This is why
the entire calibration is performed prior to adjustment or alignment.
There will be times when an instrument has a catastrophic failure.
It just dies and cannot be calibrated. This should be noted in the
calibration record. Then, once the problem is found and repaired, an
‘As Found’ calibration is performed. The UUT is treated the same as
any OOT unit, but you would not have been able to collect the original
‘As Found’ readings.
“As Left” readings are taken after repair, alignment, or adjustment.
Not all UUTs would be considered OOT when ‘As Left’ readings
are taken. In some circumstances, it might be metrology department
policy to adjust an item if its error exceeds half of its tolerance
range, while still meeting its specifications. In this type of situation,
after the UUT is adjusted to be as close to optimum as possible, a
complete calibration is again performed, collecting the ‘As Left’
readings for the final calibration record. Another example would be
when preventive maintenance inspection is going to be performed on
an item. The calibration is performed, collecting the ‘As Found’ data.
Then the PMI is completed, and an ‘As Left’ set of data is collected.
If the item is found to be out-of-tolerance at that time, there would
not be a problem since it was found to be in tolerance during the first
calibration. It would be obvious that something happened during the
cleaning, alignment or adjustment and that after a final adjustment
was completed to bring the unit back into tolerance, a final ‘As Left’
calibration would be performed.
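
The sequence just described can be summarized in a short runnable
sketch (the readings, tolerance and names below are invented for
illustration):

    # Sketch of the As Found / As Left sequence described above.
    def run_calibration(uut_readings, nominal, tolerance):
        """Complete the whole procedure: check every point, never stop early."""
        errors = [r - n for r, n in zip(uut_readings, nominal)]
        return errors, all(abs(e) <= tolerance for e in errors)

    nominal, tolerance = [0.0, 50.0, 100.0], 0.5

    # 1. As Found: a full calibration before any adjustment.
    as_found, found_ok = run_calibration([0.1, 50.7, 100.2], nominal, tolerance)
    if not found_ok:
        print("OOT found: record readings, finish the procedure, notify owner")

    # 2. Adjust only after the complete As Found run, because an adjustment
    #    at one point can shift other ranges or parts of the calibration.
    adjusted_readings = [0.0, 50.1, 100.0]   # assume the adjustment succeeded

    # 3. As Left: a second complete calibration for the final record.
    as_left, left_ok = run_calibration(adjusted_readings, nominal, tolerance)
    print(f"as found ok: {found_ok}, as left ok: {left_ok}")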
The standard reading, from the working or reference standard you are
using to calibrate the UUT, will also be recorded on the calibration form.
Usually, the standard is set at a predetermined output, and the UUT is
read to see how much it deviates from the standard. This is a best practice
policy that has been in use in the metrology community since calibration
started. However, there will be times when this is not possible.
One example when it would not be practical to set the standard and
take a reading is during the calibration of water baths. The water bath
is set to a predetermined temperature, and the temperature standard is


used to record the actual reading. Compare this to the calibration of
pressure gages, where a pressure standard is set to a standard pressure,
and the gage(s) under test are then read, and their pressures recorded on
the calibration record, and compared to the standard to see if they are
in or out of tolerance. In other cases, such as the calibration of autoclaves,
they are set to complete a sterilization cycle and a temperature device
records all of the temperature readings throughout the cycle and the
readings are checked to see if the autoclave met its specifications. The
same happens when calibrating thermometers. They, along with the
standard, are placed in a dry block and a particular temperature is
set. The UUT is compared to the reference after equilibration, and a
determination is made as to whether the UUT is in or out of tolerance. As
can be seen by the above examples, it is not always possible to set the
standard and take a reading from the UUT.
Also on the calibration form should be an area to identify the
standard(s) that were used, along with their next calibration due
date(s), plus their specifications and range.
There should also be a place to identify which calibration procedure
was used, along with the procedure’s revision number. There must be
a statement showing traceability to your NMI, or in the case of most
companies in the USA, to NIST, or to any artifact that was used as a
standard.
You should include any uncertainty budgets if used, or at least a
statement that a TUR of ≥ 4:1 was met.
List environment conditions when appropriate and show if they pass
or fail. According to NCSL International Calibration Control Systems
for the Biomedical and Pharmaceutical Industry – Recommended
Practice RP-6, paragraph 5.11: “The calibration environment need be
controlled only to the extent required by the most environmentally
sensitive measurement performed in the area.”4
According to ANSI/NCSL Z540.3-2006, paragraph 5.3.6 Influence
factors and conditions: “All factors and conditions of the calibration
area that adversely influence the calibration results shall be defined,
monitored, recorded, and mitigated to meet calibration process
requirements. Note: Influencing factors and conditions may include
temperature, humidity, vibration, electromagnetic interference, dust, etc.
Calibration shall be stopped when the adverse effects of the influence
factors and conditions jeopardize the results of the calibration.”5
If the conditions within the area that calibrations are being performed
require monitoring according to the standard or requirements that must
be met, then a formal program must be in place for tracking those
conditions and reviewing the data. If this is the case, then there should
be a place in the calibration form for showing that those conditions were
either met, were not met, or are not applicable to that calibration.
You should indicate on the form if the calibration passed or failed.
If the UUT had an out-of-tolerance condition, then there should
be a place to show what happened to the UUT, with the following
possibilities as an example: 
•  The user/customer was notified and the UUT was adjusted and
meets specifications.
•  The user/customer was notified and the UUT was given a ‘limited
calibration’ with their written approval.
•  The user/customer was notified and the UUT was taken out of
service and tagged as unusable.
Notice that in each circumstance the user/customer must be
notified of any and all OOTs. This is called for in all of the standards
and regulations. The user/customer, even if internal to the company
performing the calibrations, must be informed if their test equipment
does not meet their specifications.
There should be an area set aside in the calibration form for making
comments or remarks. Enough space should be available for the
calibration technician to include information about the calibration,
OOT conditions, what was accomplished if an OOT was found, etc.
And finally, the calibration record must be signed and dated by
the technician performing the calibration. In some instances, the
calibration record requires a ‘second set of eyes’. This means that an
individual higher up the chain of command (supervisor, manager,
QA inspector, etc.) must review the calibration record and also sign
and date that it has been reviewed, audited, or inspected before it is
considered a completed record. If this is the case, there should be a
place on the form for the final reviewer to sign and date.
What do you do if, after recording your results, you find that you
have made an error, or transposed the wrong numbers, and want to
correct the error? For hard copy records, draw a single line through
the entry, write the correct data, and then place your initials and date
next to the data using black ink. Do not use white-out, or erase the
original data. For making corrections to electronic records (eRecords),
use whatever tracking system the software uses; or make a duplicate
record from scratch with the correct data and explain in the comments
block what happened, and date and sign accordingly.
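As a hedged sketch of the electronic route (the record structure and field names here are hypothetical, not from any particular system), a correction can be appended as a new entry that references the original value rather than overwriting it, preserving the audit trail:

```python
from datetime import date

# Hypothetical eRecord with a transposed reading (structure is illustrative only)
record = {"id": "CAL-0042", "reading": 21.15, "corrections": []}

def correct_record(record, field, new_value, initials, reason):
    """Append a correction entry instead of overwriting; the original stays visible."""
    record["corrections"].append({
        "field": field,
        "old_value": record[field],
        "new_value": new_value,
        "by": initials,
        "date": date.today().isoformat(),
        "reason": reason,
    })
    record[field] = new_value

correct_record(record, "reading", 12.15, "JLB", "transposed digits")
print(record["reading"], record["corrections"])
```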


There should be only one way to file your records, both hard copy
and eRecords – no matter which system you use, put it into your
written procedures.

An example for filing hard copy records:


•  Each record is filed by its unique ID number
•  Records are filed with the newest in the front
•  Records are filed within a specified time frame

An example for filing eRecords:


•  Filed by ID number, calibration certificate number and calibration
date
•  Placed on a secure drive that has regular backup
•  eRecords are filed within a specified time frame
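As a minimal sketch of such a filing convention (the directory layout and naming scheme are assumptions for illustration, not a mandated format), each eRecord's path can be derived from its ID number, certificate number and calibration date:

```python
from datetime import date
from pathlib import Path

def erecord_path(unit_id: str, cert_no: str, cal_date: date) -> Path:
    """Build a predictable file path: one folder per unit ID, dated file names."""
    return Path("calibration_records") / unit_id / f"{cert_no}_{cal_date.isoformat()}.pdf"

print(erecord_path("TE-1041", "C2009-0187", date(2009, 5, 14)))
# -> calibration_records/TE-1041/C2009-0187_2009-05-14.pdf (on POSIX systems)
```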

There are many different ways to manage your calibration data since
there are a variety of ways to collect that data. Hard copy records collected
during the calibration of test instruments have been discussed in detail
already. But the collection of data by electronic means, or through the
use of calibration software, process controllers, etc., should also be
considered. Is the system validated and instrumentation qualified prior
to use? If you are using any type of computerized system, validation of
that software is mandatory. How is the data collected and stored? Is it in
its native format or dumped into a spreadsheet for analysis? All of these
need to be considered to allow for review, analysis, and/or compilation
into your forms, and eventual storage.
The use of computerized data collection brings with it not only
increased productivity and savings in time and effort; but also new
problems in how to collect, manage, review and store the data. It cannot
be emphasized enough how critical it is to validate your software, data
lines and storage systems when going entirely electronic with your
calibration records and data management.
“Check the results” means make certain the test equipment meets
the tolerances, accuracies, or upper/lower limits specified in your
procedures or instructions.
There are various ways to do this. Calibration forms should have
the range and their tolerances listed for each piece of test equipment
being calibrated. In some instances it is apparent what the tolerances
will be for the items being calibrated. In other cases it is not quite so
apparent.
“Act on the difference” means if the test equipment is out of tolerance,
you must inform the user because they may have to re-evaluate
manufactured goods, change a process or procedure, or recall product.
According to the FDA: “When accuracy and precision limits are not met,
there shall be provisions for remedial action to reestablish the limits and to
evaluate whether there was any adverse effect on the device’s quality.”

You should have a written procedure in place that explains in detail:


•  What actions are to be taken by the calibration technician?
•  What actions to be taken by the department supervisor and/or
manager?
•  What actions to be taken by the responsible owner/user of the
OOT test equipment?
You should have an SOP that explains the responsibilities of the
calibration technician:
•  Do they have additional form(s) to complete when OOT
conditions are found?
•  Do they require a ‘second set of eyes’ when/if an OOT is found?
•  Have they been trained and signed off that they know all the
proper procedures when an OOT has been found?
You should have an SOP that explains the responsibilities of the
supervisor/manager:
•  Who notifies the customer – the technician, supervisor or
manager?
•  Is a database maintained on all OOT test equipment?
•  Is the customer/user required to reply to the OOT notification; if
so is there a time limit, and a paper trail for historical reference?
After owner/user notification, is the calibration department responsible
for anything else?
•  Is the final action by the owner/user sent back for filing or
archiving?
•  Usually the department that generates an action item is responsible
for final archiving.
•  Are there any databases that need to be updated, or upper
management to be notified in case of inaction?
Do you have a database of all OOT test equipment for various
activities?
•  The database can be used for assessing yearly calibration interval
analysis.
•  Access to OOT data can assist in determining reliability of test
equipment.
•  During an audit/inspection (both internal and external) access to
past OOT data should be easily available.
Here is a hypothetical example: from an historical perspective,
generally 85% of test equipment passes calibration…
•  Among the 15% that are found to be OOT some will be due
to operator error, bad standards, bad cables/accessories, poorly
written calibration procedures, environmental conditions
(vibration, etc.).
•  If a higher fail rate is noticed, before changing calibration intervals,
check that the proper specifications are being used.

Typical calibration process as shown in a flow chart:

•  Start: perform an ‘As Found’ test and save the ‘As Found’ results.
•  If no adjustment is required, the calibration ends there.
•  If adjustment is required, adjust as needed and perform an ‘As Left’ test.
•  If the ‘As Left’ results are not within limits, adjust and test again.
•  When the results are within limits, save the ‘As Left’ results. End.


Developing a world-class calibration program


A quality calibration program might be compared to an iceberg. Only
about 10% can be easily seen by the casual observer. However, the
unseen portion is what keeps the iceberg afloat and stable in the ocean.
The same can be said of a quality calibration program. The “Say what
you do, Do what you say, Record what you did, Check the results, and Act
on the difference” portion, along with traceability should be apparent
to an auditor or inspector. But the different parts that keep a quality
calibration program running efficiently consist of elements from a
continuous process improvement program, scheduling and calibration
management software, an effective training program, a comprehensive
calibration analysis program, correct and properly used calibration
and equipment labels, and a visible safety program. Without any one
of these programs, a quality calibration program would be impossible
to maintain.
Having an effective calibration management program is usually the
difference between being proactive and reactive to performing your
routine calibrations. By knowing what is coming due for calibration, you
can schedule your technicians, standards, time and other resources
to the best advantage. This can be compared to the person who is
trying to drain the swamp while fighting off the alligators. It is hard
to keep your overdue calibrations at a minimum when all of your time
is spent reacting to items that keep coming due without your prior
knowledge. Any calibration management program worth the money
should have a few critical areas built into its basic program. Those
include: a master inventory list, reverse traceability, the ability to see
a 30 day schedule of items coming due for calibration, and the ability to
see all items that are currently overdue for calibration. From a managerial
standpoint, the calibration management program should also be able
to show calibrations and repairs by individual items, groups of items
by location/part number, items that are OOT, and other listings that
help to manage your department. According to most standards and
regulations, any software program used must be validated prior to
implementation. This can be accomplished using the manufacturer’s
system, or by incorporating an in-house validation system. Either way,
your validation paperwork needs to be available for inspection during
audits and inspections.
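As a hedged sketch of the scheduling logic behind those listings (the inventory structure is hypothetical), the 30-day and overdue lists reduce to simple date comparisons over the master inventory:

```python
from datetime import date, timedelta

# Hypothetical master inventory: each item carries its next calibration due date
inventory = [
    {"id": "PG-007", "due": date(2009, 6, 10)},
    {"id": "TT-101", "due": date(2009, 5, 1)},
    {"id": "FM-220", "due": date(2009, 8, 30)},
]

today = date(2009, 5, 20)
window = today + timedelta(days=30)

overdue    = [i["id"] for i in inventory if i["due"] < today]
coming_due = [i["id"] for i in inventory if today <= i["due"] <= window]

print("Overdue:", overdue)             # ['TT-101']
print("Due in 30 days:", coming_due)   # ['PG-007']
```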
A best practice among experienced calibration practitioners is the
calibration of like items, and using your scheduling software to also

perform calibrations in geographical areas or combining calibrations
in local areas. An example of this would be to calibrate all pressure
gages that were shown to be stored or used in a specific area, or floor of
a building. This would be using your time to the best advantage. Also,
if calibrations were to be performed in a ‘clean room’ environment,
where the calibration technician is required to gown up prior to every
entry, then scheduling all of the calibrations in that area together
could increase production and reduce the down time caused by
multiple entries and exits.
Combining the calibration of like items and mixing and matching
items can reduce the tedium of mundane and boring calibrations. An
example would be to start all temperature calibrations (set water baths
up for their initial temperature readings), then perform several pipette
or balance calibrations, return to set another temperature in the water
baths (doing a few at a time), return to finish the pipette or balance
calibrations, then complete the water baths at their final setting. By not
having to stand around to wait for the water baths to equilibrate, you are
using your time more efficiently, increasing productivity, and keeping
the calibration technician involved and focused instead of bored.
Another critical yet often times misunderstood program is
calibration interval analysis. How often should each type of test
equipment be calibrated? Should the manufacturer’s recommended
interval be the determining factor? Or should the criticality of how the
test equipment is used in your particular production or manufacturing
line be the deciding vote? Your specific situation should be the driving
factor in deciding calibration interval analysis. Most manufacturers
recommend a 12 month calibration interval, depending on usage,
environment, handling, etc. A particular item used in a controlled
environment should be more reliable than one used in a harsher
situation, say outdoors in severe weather. Also, you must consider if the
test equipment is used to determine final product where specifications
are very tight, or used as an item that is coded as “No Calibration
Required” on a loading dock. Each situation should be considered
carefully so that they can be reviewed in the appropriate light.
Calibration interval analysis software can be purchased commercially
and used to evaluate your test equipment. Also, NCSL International
has RP-1, Establishment & Adjustment of Calibration Intervals. This
Recommended Practice (RP) is intended to provide a guide for the
establishment and adjustment of calibration intervals for equipment
subject to periodic calibration. It provides information needed to

design, implement and manage calibration interval determination,
adjustment and evaluation programs. Both management and technical
information are presented in this RP. Several methods of calibration
interval analysis and adjustment are presented. The advantages and
disadvantages of each method are described and guidelines are given
to assist in selecting the best method for a requiring organization.
A company could also do their own analysis if they support a limited
number of items, or are on a tight budget and are willing to do their
own computations. Here is an example.
•  For each type of equipment, collect data over a one year period
on: number of calibrations and number of items OOT
•  Take the number of calibrations minus the number of OOTs,
divide result by the number of calibrations, then take the result
times 100 for the pass rate
•  Make a risk assessment of each item for your company’s needs; set
a cut off for increasing or decreasing calibration intervals
•  Consider increasing a calibration interval if the pass rate ≥ 95%
(by ½ up to double the current calibration interval)
•  Consider decreasing a calibration interval if the pass rate ≤ 85%
(by ¾ to ½ of the current calibration interval)
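A minimal sketch of this computation, using the thresholds from the rule of thumb above (the counts are hypothetical):

```python
def recommend_interval(current_months: float, calibrations: int, oot: int) -> float:
    """Suggest a new calibration interval from one year of pass/fail history."""
    pass_rate = (calibrations - oot) / calibrations * 100
    if pass_rate >= 95:                # highly reliable: consider lengthening
        return current_months * 1.5    # by half, up to double the interval
    if pass_rate <= 85:                # unreliable: consider shortening
        return current_months * 0.5    # to three quarters, down to half
    return current_months              # leave the interval unchanged

# Example: 40 calibrations, 1 OOT in a year -> 97.5 % pass rate
print(recommend_interval(12, 40, 1))   # 18.0 months
```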
No matter which route you take for calibration interval analysis
– ensure you are on the cutting-edge – not on the ragged-edge by
extending your intervals too fast without solid data; recalls can be very
expensive, in time and money, and to your company’s reputation!

The cost and risk of not calibrating


Are there costs and/or risks associated with not calibrating your test
equipment? This is a double edged sword. On one side we have the
requirement of standards and regulations that govern various companies,
industries and even countries. Not only is calibration a requirement, but
one of the foundations for any quality system in the 21st century. It isn’t
a question of whether you have a quality calibration program in place,
but whether it complies with all the requirements of the appropriate
standard or regulation to which your company must conform.
The other side of the double edged sword is having a calibration
program in place without any type of quality, traceability or
documentation. This would equate to not having any type of
calibration program at all. If a manufacturer produces any type of
product or service where repeatable measurements take place then

their test equipment/instruments need to have repetitive outputs.
Without calibration to a traceable standard (national, international,
or intrinsic), there can be no repeatability. Therefore there can be no
quality in the product, so the company would never be able to stay in
business long enough to impact their market segment.
So is there cost and risk? Absolutely. The cost is huge in terms of lost
production, time, money, and reputation. In the case of companies
that have untraceable calibration in the production of medical devices,
pharmaceutical drugs and products that impact human safety – the
cost could be immeasurable…with the possibility of death among the
results.
The basic belief is this – it is absolutely essential to have a quality
calibration program in place to make a quality product, no matter
the size, shape, or quantity. The question that should be asked is:
“Do you have a quality calibration program that has traceable results
to a national or international standard?” If the answer is yes, then it
is assumed that to have a quality calibration program, you must also
have all the parts needed to support traceable calibration: calibration
procedures, calibration records, traceable documentation, an out-of-
tolerance program and procedures, document control procedures, a
training program, continuous process improvements, a comprehensive
calibration management software package, calibration interval
analysis, documented training for all your calibration technicians,
and the ability to provide quality customer service in a timely manner.
Then you can say you have a quality calibration program.
But it doesn’t end there. Referring to a double edged sword, what
are the responsibilities of a quality calibration department and also
those of their customer?
A calibration/metrology department should be responsible for:
•  Listening to their customers to understand their requirements
and needs
•  Translating those requirements to the accuracy and specifications
of the test equipment and support services that meet or exceed
their quality expectations
•  Delivering test equipment that consistently meets requirements
for reliable performance
•  Providing knowledgeable and comprehensive test equipment
support
•  Continuously reviewing and improving their services and
processes

Your customers should be responsible for:
•  Informing Metrology of their requirements and needs
•  Getting the proper training in the correct and safe usage of test
equipment
•  Maintaining their test equipment without abusing, contaminating
or damaging it under normal operating conditions
•  Using their work order system for requesting service when
equipment is broken, malfunctioning, or in need of calibration

As Lord Kelvin was quoted as saying, “If you cannot measure it, you
cannot improve it.”

1. EAL-G12, Traceability of Measurement. Edition 1, November 1995.
2. EA-4/02, Expression of the Uncertainty of Measurement in Calibration. December 1999 rev00.
3. Bucher, Jay L. 2007. The Quality Calibration Handbook. Milwaukee: ASQ Quality Press.
4. NCSL. 1999. Calibration Control Systems for the Biomedical and Pharmaceutical Industry, RP-6. Boulder, CO.
5. NCSL International. 2006. ANSI/NCSL Z540.3-2006. Boulder, CO.

PART 2

Traceable and efficient calibrations in the process industry

1.  Introduction

Today’s modern process plants, production processes and quality
systems put new and tight requirements on the accuracy of process
instruments and on process control.
Quality systems, such as the ISO9000 and ISO14000 series of quality
standards, call for systematic and well-documented calibrations, with
regard to accuracy, repeatability, uncertainty, confidence levels etc.
Does this mean that the electricians and instrumentation people
should be calibration experts? Not really, but this topic should
not be ignored. Fortunately, modern calibration techniques and
calibration systems have made it easier to fulfill the requirements on
instrumentation calibration and maintenance in a productive way.
However, some understanding of the techniques, terminology and
methods involved in calibration must be known and understood in
order to perform according to international quality systems.

2.  What is calibration and why calibrate


Calibration can be briefly described as an activity where the instrument
being tested is compared to a known reference value, i.e. a calibrator. The
keywords here are ‘known reference’, which means that the calibrator
used should have a valid, traceable calibration certificate.
To be able to answer the question why calibrate, we must first
determine what measurement is and why measuring is necessary.


WHAT IS MEASUREMENT?
In technical standards terms the word measurement has been
defined as:
“A set of experimental operations for the purpose of determining
the value of a quantity.”

What, then, is the value of a quantity? According to the standards,
the true value of a quantity is:
“The value which characterizes a quantity perfectly defined
during the conditions which exist at the moment when the value
is observed. Note: the true value of a quantity is an ideal concept
and, in general, it cannot be known.”

Therefore all instruments display false indications!

[Figure: Hierarchy of accuracy – from the true value down through international and national standards, authorized laboratories, and instrument departments’ house and working standards, to process instrumentation.]

3.  Why measure?


The purpose of a process plant is to convert raw material, energy,
manpower and capital into products in the best possible way. This
conversion always involves optimization, which must be done better
than the competitors. In practice, optimization is done by means of
process automation. However, regardless of how advanced the process
automation system is, the control cannot be better than the quality of
the measurements from the process.

4.  Why calibrate


The primary reason for calibrating is based on the fact that even the
best measuring instruments lack absolute stability; in other words,
they drift and lose their ability to give accurate measurements. This
drift makes recalibration necessary.

[Figure: Everything is based on measurements – the process control system and the plant instrumentation rely on measurements, controls and adjustments to convert production factors into products.]

Environment conditions, elapsed time and type of application can
all affect the stability of an instrument. Even instruments of the same
manufacturer, type and range can show varying performance. One
unit can be found to have good stability, while another performs
differently.

Other good reasons for calibration are:

•  To maintain the credibility of measurements
•  To maintain the quality of process instruments at a good-as-new level
•  Safety and environmental regulations
•  ISO9000, other quality systems and regulations

The ISO9000 and ISO14000 series can assist in guiding regular, systematic
calibrations, which produce uniform quality and minimize the
negative impacts on the environment.

[Figure: Quality maintenance – repeated calibrations (C1–C7) keep the maintained quality (QM) of an instrument close to its purchased quality (QP), whereas with zero maintenance the quality (QZM) drifts below the lower tolerance over time.]

5.  Traceability
Calibrations must be traceable. Traceability is a declaration stating to
which national standard a certain instrument has been compared.

6.  Regulatory requirements for calibration

6.1  ISO9001: 2000


The organization determines the monitoring and measurements to
be performed, as well as the measuring devices needed to provide
evidence of a product’s conformity to determined standards.
The organization establishes the processes for ensuring that
measurements and monitoring are carried out, and are carried out
in a manner consistent with the monitoring and measurement
requirements.
Where necessary to ensure valid results, measuring equipment
is calibrated or verified with measurement standards traceable to
national or international standards at specified intervals. If no
such standards exist, the basis used for calibration or verification
is recorded. The equipment is adjusted or re-adjusted as necessary;
identified to enable determination of its calibration status;
safeguarded against adjustments that would invalidate the
measurement result; and protected from damage and deterioration
during handling, maintenance and storage.
In addition, the organization assesses and records the validity of
the previous measuring results when the equipment is found not to
conform to requirements. The organization then takes appropriate
action on the equipment and any product affected. Records of the
calibration and verification results are then maintained.
When used in the monitoring and measurement of specified
requirements, the ability of computer software to satisfy the intended
application is confirmed. This is done prior to initial use and
reconfirmed as necessary.

[Figure: SI units and the traceability chain – international standards, national standards, reference standards, working standards and process standards.]

Note: See ISO 10012 for further information.


6.2  PHARMACEUTICAL (FDA, U.S. Food and Drug Administration)


Any pharmaceutical company that sells their products in the USA
must comply with FDA regulations regardless of where the products are
manufactured.

•  Calibration records must be maintained.


•  Calibrations must be done according to written, approved
procedures.
•  Each instrument should have a master history record.
•  All instrumentation should have a unique ID; all product, process
and safety instruments should be physically tagged.
•  A calibration period and error limits should be defined for each
instrument.
•  Standards should be traceable to national and international
standards.
•  Standards must be more accurate than the required accuracy of
the equipment being calibrated.
•  All instruments used must be fit for purpose.
•  There must be documented evidence that personnel involved in
the calibration process have been trained and are competent.
•  Documented change management system must be in place.
•  All electronic systems must comply with FDA’s regulation 21 CFR
Part 11.
•  All of the above should be implemented in conjunction with
following regulations:
–  21 CFR Part 211 – ”Current Good Manufacturing Practice for
Finished Pharmaceuticals”
–  21 CFR Part 11 – ”Electronic Records; Electronic Signatures”

Software systems need features such as Electronic Signature, Audit


Trail, User Management, and Security System to be able to comply
with these regulations.
In such a system, the Electronic Signature is considered equivalent to
a hand-written signature. Users must understand their responsibilities
once they give an electronic signature. The Audit Trail is required for
change management. It must be a tool that records all modifications,
which add, edit, or delete data from an electronic record.


7.  DEFINITIONS OF METROLOGICAL TERMS


Some metrological terms in association with the concept of calibration
are described in this section.
Quite a few of the following terms are also used on specification
sheets for calibrators. Please note that the definitions listed here are
simplified.

Calibration
An unknown measured signal is compared to a known reference
signal.

Validation
Validation of measurement and test methods (procedures) is generally
necessary to prove that the methods are suitable for the intended
use.
Non-linearity
Non-linearity is the maximum deviation of a transducer’s output from
a defined straight line. Non-linearity is specified by the Terminal
Based method or the Best Fit Straight Line method.

Resolution
Resolution is the smallest interval that can be read between two
readings.

Sensitivity
Sensitivity is the smallest variation in input, which can be detected as
an output. Good resolution is required in order to detect sensitivity.

Hysteresis
The deviation in output at any point within the instrument’s sensing
range, when first approaching this point with increasing values, and
then with decreasing values.


Repeatability
Repeatability is the capability of an instrument to give the same
output among repeated inputs of the same value over a period of time.
Repeatability is often expressed in the form of standard deviation.

Temperature coefficient
The change in a calibrator’s accuracy caused by changes in ambient
temperature (deviation from reference conditions). The temperature
coefficient is usually expressed as % F.S. / °C or % of RDG/ °C.

Stability
Often referred to as drift, stability is expressed as the percentage
change in the calibrated output of an instrument over a specified
period, usually 90 days to 12 months, under normal operating
conditions. Drift is usually given as a typical value.

Accuracy
Generally, accuracy figures state the closeness of a measured value
to a known reference value. The accuracy of the reference value is
generally not included in the figures. It must also be checked if errors
like non-linearity, hysteresis, temperature effects etc. are included in
the accuracy figures provided.
Accuracy is usually expressed as % F.S. (percent of full scale) or as
% of RDG (percent of reading) + adder. The difference between these
two expressions is great. Consider two typical pressure specifications:
0.05% FS versus 0.025% of RDG + 0.01% FS.
The only way to compare accuracy presented in different ways is to
calculate the total error at certain points.
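As a hedged illustration (the 20 bar range and test points are arbitrary choices, not from the text), the following sketch computes the total error of both specifications at a few points; it shows how the percent-of-reading specification wins at low readings:

```python
# Compare two accuracy specifications by computing total error at test points.
# Assumed example: a 20 bar full-scale range (hypothetical; any range works).

FULL_SCALE = 20.0  # bar

def error_fs(reading, fs_pct=0.05):
    """Total error for a % of full scale spec (constant over the range)."""
    return FULL_SCALE * fs_pct / 100.0

def error_rdg_plus_adder(reading, rdg_pct=0.025, adder_fs_pct=0.01):
    """Total error for a % of reading + % FS adder spec."""
    return reading * rdg_pct / 100.0 + FULL_SCALE * adder_fs_pct / 100.0

for p in (2.0, 10.0, 20.0):  # bar
    print(f"{p:5.1f} bar: 0.05%FS -> ±{error_fs(p):.4f} bar, "
          f"0.025%RDG+0.01%FS -> ±{error_rdg_plus_adder(p):.4f} bar")
```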

Uncertainty
Uncertainty is an estimate of the limits, at a given coverage factor (or
confidence level), which contain the true value.
Uncertainty is evaluated according to either a “Type A” or a “Type
B” method. Type A involves the statistical analysis of a series of

58
traceable and efficient calibrations

measurements. In this case, uncertainty is calculated using Type A
uncertainties, i.e. the effects of these components include measurement
errors, which can vary in magnitude and in sign, in an unpredictable
manner. The other group of components, Type B, could be said to be of
a systematic nature. Systematic errors or effects remain constant during
the measurement. Examples of systematic effects include errors in
reference value, set-up of the measuring, ambient conditions, etc. Type
B uncertainty is used when the uncertainty of a single measurement
is expressed.
It should be noted that, in general, errors due to observer fallibility
cannot be accommodated within the calculation of uncertainty.
Examples of such errors include: errors in recording data, errors in
calculation, or the use of inappropriate technology.

Type A uncertainty
The Type A method of calculation can be applied when several
independent measurements have been made under the same
conditions. If there is sufficient resolution in the measurement, there
will be an observable difference in the values measured.
The standard deviation, often called the “root-mean-square
repeatability error”, for a series of measurements under the same
conditions, is used for calculation. Standard deviation is used as a
measure of the dispersion of values.
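A minimal sketch of a Type A evaluation, assuming a small series of repeated readings of the same point (the readings are hypothetical):

```python
import statistics

# Hypothetical repeated readings of the same calibration point (e.g. mA)
readings = [12.012, 12.015, 12.011, 12.014, 12.013]

s = statistics.stdev(readings)      # experimental standard deviation
u_a = s / len(readings) ** 0.5      # Type A standard uncertainty of the mean

print(f"mean = {statistics.fmean(readings):.4f}")
print(f"standard deviation s = {s:.5f}")
print(f"Type A uncertainty of the mean u_A = {u_a:.5f}")
```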

Type B uncertainty
Type B evaluation of uncertainty involves the use of other means
to calculate uncertainty, rather than applying statistical analysis of a
series of measurements.
It involves the evaluation of uncertainty using scientific judgement
based on all available information concerning the possible variables.
Values belonging to this category may be derived from:

•  Experience with or general knowledge of the behavior and properties
of relevant materials and instruments
•  Ambient temperature
•  Humidity
•  Local gravity
•  Atmospheric pressure


•  Uncertainty of the calibration standard
•  Calibration procedures
•  Method used to register calibration results
•  Method used to process calibration results

The proper use of the available information calls for insight based
on experience and general knowledge. It is a skill that can be learnt
with practice. A well-based Type B evaluation of uncertainty can
be as reliable as a Type A evaluation of uncertainty, especially in
a measurement situation where a Type A evaluation is based only
on a comparatively small number of statistically independent
measurements.

Expanded uncertainty
The EA has decided that calibration laboratories accredited by members
of the EA shall state an expanded uncertainty of measurement, obtained
by multiplying the standard uncertainty by a coverage factor k. In cases
where a normal (Gaussian) distribution can be assumed, the standard
coverage factor k = 2 should be used. The expanded uncertainty then
corresponds to a coverage probability (or confidence level) of
approximately 95%.
For uncertainty specifications, there must be a clear statement of
coverage probability or confidence level. Usually one of the following
confidence levels is used:
1 s = 68%
2 s = 95%
3 s = 99%
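Putting the pieces together, a minimal sketch (with hypothetical component values, assuming independent components combined as a root sum of squares) of forming an expanded uncertainty:

```python
import math

# Hypothetical standard uncertainty components (same unit, e.g. kPa)
u_type_a = 0.012           # from repeated measurements (std dev of the mean)
u_type_b = [0.020, 0.008]  # e.g. reference standard, ambient effects

# Combined standard uncertainty: root sum of squares of independent components
u_c = math.sqrt(u_type_a**2 + sum(u**2 for u in u_type_b))

# Expanded uncertainty with coverage factor k = 2 (~95 % for normal distribution)
k = 2
U = k * u_c
print(f"combined u_c = {u_c:.4f}, expanded U (k=2) = {U:.4f}")
```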

8.  CALIBRATION MANAGEMENT


Many companies do not pay enough attention to calibration
management although it is a requirement e.g. in ISO9001: 2000. The
maintenance management system may alert when calibration is needed
and then opens up a work order. Once the job has been done, the work
order will close and the maintenance system will be satisfied.
Unfortunately, what happens between the opening and closing of the
work order is often not documented. If something is documented,
it is usually in the form of a hand-written sheet that is then archived.
If the calibration results need to be examined at a later time, finding
the sheets requires a lot of effort.

Choosing professional tools for maintaining calibration records
and doing the calibrations can save a lot of time, effort and money.
An efficient calibration management system consists of calibration
management software and documenting calibrators.
Modern calibration management software can be a tool that
automates and simplifies calibration work at all levels. It automatically
creates a list of instruments waiting to be calibrated in the near future.
If the software is able to interface with other systems the scheduling
of calibrations can be done in the maintenance system from which
the work orders can be automatically loaded into the calibration
management software.
When the technician is about to calibrate an instrument, (s)he simply
downloads the instrument details from the calibration management
software into the memory of a documenting calibrator; no printed
notes, etc. are needed. The “As Found” and “As Left” results are saved
in the calibrator’s memory, and there is no need to write down anything
with a pen.
The instrument’s measurement ranges and error limits are defined in
the software and also downloaded to the calibrator. Thus the calibrator
is able to detect whether the calibration passed or failed immediately
after the last calibration point is recorded. There is no need to make
tricky calculations manually in the field.
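As a hedged sketch of the pass/fail check described above (the instrument, limits and point values are illustrative, not any vendor's logic), the error at each point can be expressed as a percentage of span and compared to the error limit:

```python
# Hypothetical instrument: 0–10 bar input, 4–20 mA output, 0.5 % of span limit
SPAN_MA = 20.0 - 4.0
ERROR_LIMIT_PCT = 0.5

def ideal_output(pressure_bar):
    """Ideal 4–20 mA output for a linear 0–10 bar transmitter."""
    return 4.0 + SPAN_MA * pressure_bar / 10.0

def point_passes(pressure_bar, measured_ma):
    """True if the measured output is within the error limit at this point."""
    error_pct = (measured_ma - ideal_output(pressure_bar)) / SPAN_MA * 100.0
    return abs(error_pct) <= ERROR_LIMIT_PCT

# The calibration passes only if every recorded point is within limits
points = [(0.0, 4.01), (5.0, 12.03), (10.0, 20.09)]
print("PASSED" if all(point_passes(p, ma) for p, ma in points) else "FAILED")
```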
All this saves an extensive amount of time and prevents the user
from making mistakes. The increase in work productivity allows for
more calibrations to be carried out within the same period of time
as before. Depending on what process variable is calibrated and how
many calibration points are recorded, using automated tools can be 5
to 10 times faster compared to manual recording.
When the calibration results are uploaded to the database, the
software automatically detects the calibrator that was used, and the
traceability chain is documented without requiring any further actions
from the user.
Calibration records, including the full calibration history of an
instrument, are kept in the database; therefore accessing previous
results is also possible in just a few seconds. When an instrument has
been calibrated several times, software displays the “History Trend”,
which assists in determining whether or not the calibration period
should be changed.
One of today’s trends is to move towards a paperless office. If the
calibration management software includes the right tools, it is possible

to manage calibration records on computer without producing any
papers. If paper copies of certificates are preferred, printing them must,
of course, be possible. When all calibration related data is located in
a single database the software is obviously able to create calibration
related reports and documents.
Today’s documenting calibrators are capable of calibrating many
process signals. It is not uncommon to have a calibrator that
calibrates pressure, temperature and electrical signals including
frequency and pulses. In addition to the conventional mA output of
a transmitter, modern calibrators can also read HART, Foundation
Fieldbus or Profibus output of the transmitters, and they can be even
used for configuring these “smart” transmitters.
Implementing a modern calibration management system benefits
everybody who has anything to do with instrumentation. For instance,
the maintenance manager can use it as a calibration planning and
decision-making tool for tracking and managing all calibration related
activities.
When an auditor comes for a visit, QA will find a calibration
management system useful. The requested calibration records can
be viewed on screen with a couple of mouse clicks. If a calibrator drifts
out of its specifications, it is possible to use a “reverse traceability
report” to get a list of instruments that have been calibrated with that
calibrator.
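A reverse traceability report is conceptually a filtered query over stored calibration events; a minimal sketch, with a hypothetical in-memory event list standing in for the database:

```python
from datetime import date

# Hypothetical calibration events as stored by the management software
events = [
    {"position": "PT-101", "calibrator": "MC5-1234", "date": date(2009, 3, 2)},
    {"position": "TT-205", "calibrator": "MC5-5678", "date": date(2009, 4, 11)},
    {"position": "FT-310", "calibrator": "MC5-1234", "date": date(2009, 5, 20)},
]

def reverse_traceability(events, calibrator_id, since):
    """List instruments calibrated with a given calibrator since a date."""
    return sorted({e["position"] for e in events
                   if e["calibrator"] == calibrator_id and e["date"] >= since})

# Which instruments are affected if MC5-1234 is found out of tolerance?
print(reverse_traceability(events, "MC5-1234", date(2009, 1, 1)))
```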
Good calibration tools help technicians work more efficiently and
accurately. If the system manufacturer has paid attention to usability,
the system is easy to learn and use. When many tasks are automated,
the users can concentrate on their primary job.
Transferring to a new calibration system may sound like a huge task
and it can be a huge task. There are probably thousands of instruments
that need to be entered into the database and all the details must be
checked and verified before the system is up and running. Although
there is a lot of data involved, it does not mean the job is an enormous
one.
Nowadays most companies have instrumentation data in some type
of electronic format: as Excel spreadsheets, Maintenance databases,
etc. The vendor of the calibration system is most likely able to import
most of the existing data to the calibration database saving months
of work.
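As a hedged sketch of such an import (the CSV column names and file are hypothetical), spreadsheet data exported as CSV can be mapped onto instrument records with a few lines of scripting:

```python
import csv

# Hypothetical CSV export from an existing maintenance spreadsheet:
# position_id, description, range_low, range_high, unit, error_limit_pct
instruments = []
with open("instruments.csv", newline="") as f:
    for row in csv.DictReader(f):
        instruments.append({
            "position": row["position_id"],
            "description": row["description"],
            "range": (float(row["range_low"]), float(row["range_high"])),
            "unit": row["unit"],
            "error_limit_pct": float(row["error_limit_pct"]),
        })

print(f"Prepared {len(instruments)} instrument records for import.")
```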


CONCLUSION

A good, automated calibration system reduces workload
because it carries out tasks faster, more accurately and with
better results than could be reached with a manual system.
It assists in documenting, scheduling, planning, analyzing and
finally optimizing the calibration work.

References
[1] ISO9001: 2000 “Quality Management Systems. Requirements”
[2] 21 CFR Part 11: “Electronic Records; Electronic Signatures”
[3] 21 CFR Part 211: “Current Good Manufacturing Practice for Finished Pharmaceuticals”

How often should instruments be calibrated

Plants can improve their efficiency and reduce costs
by performing calibration history trend analysis. By
doing so, a plant is able to define which instruments
can be calibrated less frequently and which should be
calibrated more frequently. Calibration history trend
analysis is only possible with calibration software
that provides this functionality.
Using Calibration History Trend Analysis to Adjust Calibration
Intervals of Plant Instrumentation
Manufacturing plants need to be absolutely confident that their
instrumentation products – temperature sensors, pressure transducers,
flow meters and the like – are performing and measuring to specified
tolerances. If sensors drift out of their specification range, the
consequences can be disastrous for a plant, resulting in costly production
downtime, safety issues or possibly leading to batches of inferior quality
goods being produced, which then have to be scrapped.
Most process manufacturing plants will have some sort of
maintenance plan or schedule in place, which ensures that all
instruments used across the site are calibrated at the appropriate times.
However, with increasing demands and cost issues being placed on
manufacturers these days, the time and resources required to carry
out these calibration checks are often scarce. This can sometimes lead
to instruments being prioritised for calibration, with those deemed
critical enough receiving the required regular checks, while other
sensors deemed less critical to production are calibrated less
frequently or not at all.
But plants can improve their efficiencies and reduce costs by
using calibration ‘history trend analysis’, a function available within
Beamex® CMX calibration software. With this function, the plant
can analyze whether it should increase or decrease the calibration
frequency for all its instruments.
Cost savings can be achieved in several ways. First, by calibrating
less frequently where instruments appear to be highly stable according
to their calibration history. Second, by calibrating instruments more
often when they are located in critical areas of the plant, ensuring
that instruments are checked and corrected before they drift out
of tolerance. This type of practice is common in companies that
employ an effective ‘Preventive Maintenance’ regime. The analysis
of historical trends, for example how a pressure sensor drifts in
and out of tolerance over a given time period, is only possible with
calibration software that provides this type of functionality.

Current Practices in Process Plants


But in reality, how often do process plants actually calibrate their
instruments and how does a maintenance manager or engineer know
how often to calibrate a particular sensor?
In July 2007, Beamex conducted a survey that asked process
manufacturing companies how many instruments in their plant
required calibrating and the frequency with which these instruments
had to be calibrated. The survey covered all industry sectors, including
pharmaceuticals, chemicals, food and beverage, oil and gas, paper
and pulp.
Interestingly, the survey showed that from all industry sectors, 50%
of the respondents said they calibrated their instruments no more than
once a year.
However, in the pharmaceuticals sector, 42% said they calibrated
once a year and 42% said they calibrated twice a year.
Perhaps unsurprisingly, due to it being a highly regulated industry, the
study also showed that the pharmaceuticals sector typically possesses
a significantly higher number of instruments per plant that require
calibrating. In addition, these plants also calibrate their instruments
more frequently than other industry sectors.


The Benefits of Analyzing Calibration History Trends


But regardless of the industry sector, by analysing an instrument’s
drift over time (i.e. the historical trend), companies can reduce costs
and improve their efficiencies. Pertti Mäki is Area Sales Manager
at Beamex in Finland. He specialises in selling the Beamex® CMX
to different customers across all industry sectors. He comments:
“The largest savings from using the History Trend Option are in the
pharmaceuticals sector, without doubt, but all industry sectors can
benefit from using the software tool, which helps companies identify
the optimal calibration intervals for instruments.”
The trick, says Mäki, is determining which sensors should be re-
calibrated after a few days, weeks, or even years of operation and which
can be left for longer periods, without of course sacrificing the quality
of the product or process or the safety of the plant and its employees.
Doing this, he says, enables maintenance staff to concentrate their
efforts only where they are needed, therefore eliminating unnecessary
calibration effort and time.
But there are other, perhaps less obvious benefits of looking at the
historical drift over time of a particular sensor or set of measuring
instruments. As Mäki explains: “When an engineer buys a particular
sensor, the supplier provides a technical specification that includes details
on what the maximum drift of that sensor should be over a given time
period. With CMX’s History Trend Option, the engineer can now verify
that the sensor he or she has purchased, actually performed within the
specified tolerance over a certain time period. If it hasn’t, the engineer
now has data to present to the supplier to support his findings.”
But that’s not all. The History Trend function also means that a
plant can now compare the quality or performance of different sensors
from multiple manufacturers in a given location or set of process
conditions. This makes it an invaluable tool for maintenance or quality
personnel who, in setting up a new process line for example, can use
the functionality to compare different sensor types to see which one
best suits the new process.
Calibration software such as CMX can also help with the planning
of calibration operations. Calibration schedules take into account the
accuracy required for a particular sensor and the length of time during
which it has previously been able to maintain that degree of accuracy.
Sensors that are found to be highly stable need not be re-calibrated as
often as sensors that tend to drift.


The Benefits of Analyzing Calibration History Trends


The ‘History Trend Option’, which is available as standard in CMX
Enterprise and as an add-on option within CMX Professional, is
basically a utility for viewing calibration history data. It is easy to use
and is available both for Positions and Devices. The data is displayed
graphically and is also available in numeric format in a table.
The function enables users to plan the optimal calibration intervals
for their instruments. Once implemented, maintenance personnel,
for example, can analyze an instrument’s drift over a certain time
period. History Trend displays numerically and graphically the
instrument’s drift over a given period. Based on this information, it is
then possible to make decisions and draw conclusions regarding the
optimal calibration interval and the quality of the instruments with
respect to measurement performance.
Users already familiar with CMX may confuse this function with
the standard ‘Calibration Results’ window, but the ‘History Trend’
window enables users to view key figures of several calibration events
simultaneously, allowing them to evaluate the calibrations of a Position
or a Device over a longer time period than the normal calibration
result view allows.
For example, the user can get an overview of how a particular device
drifts between calibrations and also whether the drift increases with
time. Also, the engineer can analyze how different devices are suited
for use in a particular area of the plant or process.
Reporting is straightforward and the user can even tailor the reports
to suit his or her individual needs, using the ‘Report Design’ tool.


CALIBRATION HISTORY TREND ANALYSIS

The History Trend Option of Beamex® CMX calibration
software allows you to analyze the instrument’s drift over
a certain time period.
• The Beamex® CMX stores every calibration event into the
database; the history trend is made automatically without any
extra manual work.
• The Beamex® CMX also indicates when new devices have
been installed and calibrated. This helps in comparing
differences between devices.
• The graphical display of the history trend helps in visualizing
and optimizing the calibration interval for the instruments.

[Screenshot: History Trend user interface]
[Screenshot: History Trend report – view the results on a history trend report that includes your company logo.]


SUMMARY

The Benefits of calibration history trend analysis.


• Analyzing and determining the optimal calibration interval for
instruments
• Conclusions can be made regarding the quality of a particular
measuring instrument
• Time savings: faster analysis is possible when compared to
traditional, manual methods
• Enables engineers to check that the instruments they have
purchased for the plant are performing to their technical
specifications and are not drifting out of tolerance regularly
• Supplier evaluation: the performance and quality of different
sensors from different manufacturers can be compared
quickly and easily.

When calibration frequency can be decreased:


• If the instrument has performed to specification and the drift
has been insignificant compared to its specified tolerance
• If the instrument is deemed to be non-critical or in a low
priority location

When calibration frequency should be increased:


• If the sensor has drifted outside of its specified tolerances
during a given time period
• If the sensor is located in a critical process or area of the
plant and has drifted significantly compared to its specified
tolerance over a given time period
• If the sensor is located in an area of the plant that has
high economic importance
• Where costly production downtime may occur as a result of
a ‘faulty’ sensor
• Where a false measurement from a sensor could lead to
inferior quality batches or a safety issue

70
how often should instruments be calibrated

ISO 9001:2000 quality management requirements

7.6 Control of monitoring and measuring devices

The organization shall determine the monitoring and measurement
to be undertaken and the monitoring and measuring devices
needed to provide evidence of conformity of product to
determined requirements.
The organization shall establish processes to ensure that
monitoring and measurement can be carried out and are carried
out in a manner that is consistent with the monitoring and
measurement requirements.
Where necessary to ensure valid results, measuring equipment
shall
a) be calibrated or verified at specified intervals, or prior to use,
against measurement standards traceable to international or
national measurement standards; where no such standards
exist, the basis used for calibration or verification shall be
recorded;
b) be adjusted or re-adjusted as necessary;
c) be identified to enable the calibration status to be
determined;
d) be safeguarded from adjustments that would invalidate the
measurement result;
e) be protected from damage and deterioration during handling,
maintenance and storage.
In addition, the organization shall assess and record the validity
of the previous measuring results when the equipment is found
not to conform to requirements.
The organization shall take appropriate action on the equipment
and any product affected.
Records of the results of calibration and verification shall be
maintained (see 4.2.4).
When used in the monitoring and measurement of specified
requirements, the ability of computer software to satisfy the
intended application shall be confirmed. This shall be undertaken
prior to initial use and reconfirmed as necessary.

How often should calibrators be calibrated

One frequently asked question is how often a process calibrator
should be re-calibrated.
As a general rule for Beamex’s MC5, starting with a 1-year
calibration period is recommended, because the MC5 has a 1-year
uncertainty specified. The calibration period can be changed in
the future, once you begin accumulating stability history, which
is then compared to the uncertainty requirements. In any case, there are
many issues to be considered when deciding a calibrator’s calibration
period, or the calibration period for any type of measuring device. This
article discusses some of the things to be considered when determining
the calibration period, and provides some general guidelines for making
this decision. The guidelines that apply to a calibrator, also apply to
other measuring equipment in the traceability chain. These guidelines
can also be used for process instrumentation.
An important aspect to consider when maintaining a traceable
calibration system is to determine how often the calibration equipment
should be recalibrated. International standards (such as ISO9000,
ISO10012, ISO17025, CFRs by FDA, GMP, etc.) require the use
of documented calibration programs. This means that measuring
equipment should be calibrated traceably at appropriate intervals and
that the basis for the calibration intervals should be evaluated and
documented.
When determining an appropriate calibration period for any
measuring equipment, there are several things to be considered. They
are discussed below.


Uncertainty need
One of the first things to evaluate is the uncertainty need of the
customer for their particular measurement device. Actually, the initial
selection of the measurement device should be also done based on this
evaluation. Uncertainty need is one of the most important things to
consider when determining the calibration period.

Stability history
When the customer has evaluated his/her needs and purchased suitable
measuring equipment, (s)he should monitor the stability history of the
measuring equipment. The stability history is an important criterion
when deciding upon any changes in the calibration period. Comparing
the stability history of measuring equipment to the specified limits and
uncertainty needs provides a feasible tool for evaluating the calibration
period. Naturally, calibration management software with the history
analysis option is a great help in making this type of analysis.
The cost of recalibration vs. consequences
of an out-of-tolerance situation
Optimizing between recalibration costs and the consequences of an out-
of-tolerance situation is important. In critical applications, the costs of
an out-of-tolerance situation can be extremely high (e.g. pharmaceutical
applications) and therefore calibrating the equipment more often is
safer. However, in some non-critical applications, where the out-of-
tolerance consequences are not serious, calibration can be made less
frequently. Therefore, evaluating the consequences of an out-of-
tolerance situation is something to be considered. The corrective actions
in such a case should also be made into an operating procedure.
Some measurements in a factory typically have more effect on a
product quality than others, and therefore some measurements are
more acute than others and should be also calibrated more often than
others.

Initial calibration period
When you purchase calibration equipment with which you are not familiar, you still need to decide the initial calibration period. In this situation, abiding by the manufacturer's recommendation is best. For more critical applications, using a shorter calibration period right from the beginning is recommended.

Other things to be considered
There are also other issues to consider when determining the calibration period, such as the workload of the equipment, the conditions in which the equipment will be used, the amount of transportation involved, and whether the equipment looks damaged.
In some cases, crosschecking with other similar measuring equipment is also a feasible way of detecting the need for calibration. In some critical applications, crosschecking may be carried out before every measurement.
Naturally, only the appropriate, metrologically responsible personnel in the company may make changes to the calibration equipment's calibration period.

SUMMARY

The main issues to be considered when determining the calibration period for measuring equipment should include at least the following:

• The uncertainty needs of the measurements to be made.
• The stability history of the measuring equipment.
• The equipment manufacturer's recommendations.
• The risk and consequences of an out-of-tolerance situation.
• The criticality of the measurements.


Automated calibration planning lowers costs

Calibration is an essential element of any instrumentation maintenance program. However, calibration operations can sometimes be long and time-consuming. By planning the process and adding the right tools, efficiency can be improved and costs lowered substantially.
Accumulated wear and random variations in a sensor's environment will inevitably reduce its accuracy over time, so periodic testing is required to guarantee that the measurements being reported actually match the conditions being monitored. Otherwise, any computerized monitoring or control systems to which the sensor is interfaced will be unable to detect off-spec conditions, and the quality of the product being manufactured will suffer.
Unfortunately, calibration operations can be long and tedious, even with the aid of an electronic calibrator that automates the tests. The sheer volume of data that must be collected and analyzed can be overwhelming when there are hundreds of sensors to be checked and multiple data points to be recorded for each.

For instance...
The experience of Croda Chemicals Europe (Nr. Goole, East Yorkshire,
UK) is typical. They use pressurized vessels to purify lanolin for health
care and beauty products. Each vessel needs to be certified at least
once every two years to demonstrate that it is safe and structurally
sound. That includes a functionality check on all of the pressure
instrumentation, as well as on the sensors that monitor the incoming
chemical additives and the outgoing effluent.


Senior Instrument Technician David Wright remembers what it was like to perform all of those calibration operations with paper and pencil during their regularly scheduled maintenance shut-downs. "It took us a week to perform the calibrations and a month to put together the paperwork."
Today, Croda uses the CMX calibration management software
system from Beamex to coordinate the data collection operations
and archive the results. “It’s faster, easier, and more accurate than our
old paper-based procedures,” says Wright. “It’s saving us around 80
man-hours per maintenance period and should pay for itself in less
than three years.”
CMX runs under the Windows operating system and connects directly to several kinds of calibrators. It is capable of tracking pre-defined, customized process instruments and calibration standards for pressure, temperature, electrical, indicator, recorder, and mass measurements. Its multidimensional plant hierarchy covers uninstalled, installed, and spare equipment, each of which can have multiple functions, multiple procedures, and work orders, including equipment classification.
Once a calibration task has been performed, CMX records the
calibration history together with timestamps, electronic signatures,
record status, and a complete audit trail. These functions are especially
necessary in regulated industry, such as the pharmaceutical industry,
where routine calibration operations are required to show that quality-
critical instruments continue to perform within the defined tolerances.
The records that are produced must be stored and be retrievable
upon demand to demonstrate to an auditor that the plant is being
maintained to an acceptable level. CMX complies with the legislation concerning electronic records and signatures set forth by the FDA in 21 CFR Part 11.

Calibration planning
Calibration software like CMX can also help with the planning of
calibration operations. Calibration schedules take into account the
accuracy required for a particular sensor and the length of time during
which it has previously been able to maintain that degree of accuracy.
Sensors that are found to be highly stable need not be re-calibrated as
often as sensors that tend to drift.
The trick is determining which sensors should be re-calibrated after
a few hours, weeks, or years of operation and which can be left as-is for


longer periods without sacrificing quality or safety. Doing so allows maintenance personnel to concentrate their efforts only where needed, thereby eliminating unnecessary calibration work.
The calibration schedule at Croda is determined by three criteria.
They must first comply with all governmental and insurance
regulations that mandate protection for the plant, its personnel, and
its environment. These are Croda’s top priorities and in some ways the
most expensive, not so much for the direct costs of complying with the
mandated calibration operations, but for the potential cost of failing
to comply. In the UK, as well as in the EU and the US, government
agencies can shut down a plant completely for violating health and
safety regulations, including those for calibration.
Croda also enforces its own in-house safety and quality standards that require certain sensors to be checked every week, every time an area of the plant shuts down for maintenance, or every year. The most frequent calibrations are reserved for critical sensors such as the pH meters that measure the acidity of the effluent discharged to the river.
Wright describes Croda’s third criteria for calibration planning as
“experience of practice”. Management analyzes the history of previous
calibration operations and determines the optimal interval between
calibrations for sensors that do not require regular checks. This analysis
can be performed automatically with calibration software like CMX,
thereby improving the efficiency of creating a calibration schedule
and relieving maintenance personnel of the need to remember when
a particular sensor is due for calibration work.
And by maintaining calibration schedules for all of the sensors in
the plant in one electronic database, calibration software can reduce
the administrative headaches of maintaining individual schedules for
individual machines, processes, and operational zones. Automatic
archiving functions also eliminate the transcription errors common to
hand-written calibration reports and work schedules, saving not only
the time required to fill out a paper report, but the time required to
do it again when mistakes are discovered.
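The scheduling logic itself can be very simple once everything is in one database. The following minimal Python sketch of a due list is purely illustrative; the tags, intervals and record layout are invented, not taken from CMX:

    from datetime import date, timedelta

    # Hypothetical instrument records: (tag, last calibration date, interval in days)
    instruments = [
        ("PT-101", date(2009, 1, 15), 365),   # stable pressure transmitter
        ("PH-205", date(2009, 5, 1), 7),      # critical pH meter, checked weekly
        ("TT-330", date(2008, 11, 20), 180),
    ]

    today = date(2009, 5, 10)
    due = [
        (tag, last + timedelta(days=interval))
        for tag, last, interval in instruments
        if last + timedelta(days=interval) <= today
    ]

    # Print the work list, most overdue first
    for tag, due_date in sorted(due, key=lambda item: item[1]):
        print(f"{tag} was due for calibration on {due_date}")

A real system adds work orders, approvals and history on top, but the core question it answers is the same: which instruments are due, and when.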


The bottom line

The ROI afforded by an automated calibration planning system will depend not only on the cost of acquiring it, but on the savings it provides. Net returns will be greatest under the following conditions:

• When the plant is highly regulated.

• When current calibration procedures are highly labor-intensive due to a large number of instruments, a large variety of instruments, a particularly complicated set of calibration procedures, or a particularly cumbersome set of paper-based reporting procedures.

• When a large percentage of instruments have discretionary calibration intervals (that is, when most instruments do not need to be calibrated at fixed intervals for regulatory, safety, or quality reasons, or because access is limited to specific maintenance periods).

• When the instruments to be calibrated must meet a wide variety of tolerance, safety, and quality requirements, especially when some requirements are stricter than others.

• When a large number of plant personnel must coordinate their efforts either to perform calibration work or to review the results.


Benefits of integrating calibration software with CMMS

The benefits of integrating calibration software with a computerized maintenance management system (CMMS)

For process manufacturers today, having a reliable, seamlessly integrated set of IT systems across the plant, or across multiple sites, is critical to business efficiency, profitability and growth.
Maintaining plant assets – whether that includes production line
equipment, boilers, furnaces, special purpose machines, conveyor
systems or hydraulic pumps – is equally critical for these companies.
This is particularly true if the company is part of an asset-intensive
industry, where equipment and plant infrastructure is large, complex
and expensive. Also, if stoppages to production lines due to equipment
breakdowns are costly, implementing the latest computerised
maintenance management (CMM) systems can help save precious
time and money.
In the process industries, a small but critical part of a company’s
asset management strategy should be the calibration of process
instrumentation. For this, Beamex’s calibration management software,
Beamex® CMX, has proved itself time and time again across many
industry sectors, including pharmaceuticals, chemicals, nuclear, metal
processing, paper, oil and gas. Manufacturing plants need to be sure
that their instrumentation products – temperature sensors, pressure
transducers, flow meters and the like – are performing and measuring
to specified tolerances. If sensors drift out of their specification


range, the consequences can be disastrous, perhaps resulting in costly production downtime, safety issues or batches of inferior quality goods being produced, which then have to be scrapped.
Beamex® CMX helps companies document, schedule, plan, analyze
and optimize their calibration work. Seamless communication
between CMX and ‘smart’ calibrators means that companies have
the ability to automate pre-defined calibration procedures. As well
as retrieving and storing calibration data, CMX can also download
detailed instructions for operation before and after calibrating. The
most common types of download include procedures, reminders and
safety-related information.
Seamless communication with calibrators also provides many practical benefits, such as a reduction in paperwork, elimination of the human error associated with manual recording, and the ability to speed up the calibration task. CMX also stores the complete calibration history of process instruments and produces fully traceable calibration records.
Today, most process manufacturers use some sort of CMM system that sits alongside their calibration management system. Common CMM systems available include SAP, Maximo and Datastream; otherwise the company may have developed its own in-house software for maintenance management.
Whilst Beamex® CMX Calibration Software functions very well
as a standalone calibration management system, customers using
the software in this way are not reaping all the rewards they could
if they were to integrate CMX with their CMM system. The CMM
system is also likely to have been implemented before the calibration
management software and so will normally be the first port of call for
maintenance staff and for generating all work orders.
But the good news for customers of Beamex® CMX Professional or Beamex® CMX Enterprise software is that CMX can now easily be integrated with CMM systems such as Maximo, SAP or Datastream. Beamex offers a 'standard' integration package, although most customers will require a customized version that suits their existing software and maintenance strategy.
Integrating CMX with a CMM system means that the plant hierarchy and all work orders for process instruments can be generated and maintained in the customer's CMM system. Calibration work orders can easily be transferred to CMX Calibration Software. Then, once the calibration work order has been executed, CMX sends an acknowledgement of the completed work order back to the customer's CMM system. All detailed calibration results are stored and available in the CMX database.


[Figure: Integrating Beamex® CMX Calibration Software with a maintenance management system (CMMS). Work orders travel from the CMMS through the integration interface as XML files to the CMX Connector and on to Beamex® CMX; completed work orders travel back the same way.]

The integration between Beamex® CMX and a computerized maintenance management system (CMMS) enables the exchange of the following data:

From CMMS to CMX:
•  Position ID
•  Device ID
•  Location
•  Serial Number
•  Work Order Numbers

From CMX to CMMS:
•  Work Order Numbers
•  Position ID
•  Max Error
•  Passed/Failed information
•  Calibration Date/Time
•  Calibrated by
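To give a feel for the file format, the following Python sketch builds a work-order message carrying the fields listed above. The element and attribute names are invented for illustration; the actual schema is defined by the integration package:

    import xml.etree.ElementTree as ET

    # Build a hypothetical work-order message with the fields listed above
    order = ET.Element("WorkOrder", number="WO-12345")
    for tag, text in [
        ("PositionID", "PT-101"),
        ("DeviceID", "D-0042"),
        ("Location", "Boiler house, line 2"),
        ("SerialNumber", "SN-998877"),
    ]:
        ET.SubElement(order, tag).text = text

    print(ET.tostring(order, encoding="unicode"))
    # <WorkOrder number="WO-12345"><PositionID>PT-101</PositionID>...</WorkOrder>

The reply travelling in the other direction would carry the result fields (maximum error, pass/fail status, calibration date and time, and the technician's name) in the same manner.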


acknowledgement order of this work back to the customer’s CMM


system. All detailed calibration results are stored and available on the
CMX database.
Jarmo Hyvärinen, Sales Manager at Beamex, comments: “Beamex’s customers have been asking us for some time whether we can integrate our CMX software with their maintenance management systems. Our integration services were introduced recently and have generated high interest among customers. Many customers are currently going through the integration process.”
These customers are in the food and beverage, oil and gas, energy, steel processing and pharmaceuticals industries and, according to Hyvärinen, use a SAP or Maximo CMM system. “The integration project normally involves three parties: the customer, the CMM system software partner and Beamex. With our customers, Beamex’s part of the integration has been successful. However, the customer may have a large CMM system and a considerable amount of data keying to perform before the integration is complete. Once finished, the integration should save these companies time, reduce costs and increase productivity by preventing unnecessary double effort and the re-keying of work orders in separate systems. When there is no need to manually re-key the data, typing errors are eliminated. This improves the quality of the entire system. Integration will also enable these companies to automate their calibration management with smart calibrators.”
Beamex’s standard integration interface uses the XML (Extensible
Markup Language) data file format, which enables the sharing of
structured data across different information systems. Data fields such
as position ID, device ID, location, serial number and work order
number can be transferred from the customer’s CMM system to
CMX. Similarly, data can be transferred the other way, including work order numbers, position ID, maximum error, pass/fail notifications, calibration date and time, and who carried out the calibration task.
“Of course, in reality, the customer will require more data fields to
be transferred, but our standard package is the first building block on
the bridge between the two systems,” explains Hyvärinen. “Often, a
data exchange module or interface is required that sits between the
two systems and so the customer’s CMM system specialist will need
to be involved here.”
Although a growing number of manufacturing companies today are beginning to realise, at boardroom level, that maintenance management deserves enterprise-wide, perhaps multi-site, attention, most plant or maintenance managers still don’t get a voice in the higher echelons of the boardroom. Maintenance is simply viewed as a necessary cost to the business and no more.
But the fact is, any initiative (including software integration) that can support an asset management strategy is likely to help cut a company’s costs. Hence the software community’s latest buzzword, ‘EAM’, or enterprise asset management. EAM is more than just maintenance management software. It’s all about companies taking a business-wide view of all their plant equipment and co-ordinating maintenance activities and resources with other departments and sites, particularly with production teams. Integrating a CMM system with calibration management software is an important step in the right direction here, particularly if the company has a high volume of process instruments that need calibrating regularly.
Savings from EAM are reasonably well-documented and come in various guises, the most common benefits being: fewer equipment breakdowns (leading to a reduction in overall plant downtime); a corresponding increase in asset utilisation or plant uptime; better management of spare parts and equipment stocks; more efficient use of maintenance staff; and optimised scheduling of maintenance tasks and resources.
“The key to success is really the quality of information you input to
the software. Part of this relates to the success of the up-front review
process, as well as the ongoing discipline of your maintenance team
that uses the system. The data has to be as close to 100% accurate
as possible to get maximum benefit from the system,” concludes
Hyvärinen.


SUMMARY

The benefits of integration
•  Plant hierarchy and all work orders for process instruments can be generated and maintained in the customer’s CMM system.
•  Calibration work orders can easily be transferred to CMX calibration management software.
•  Companies save time, reduce costs and increase productivity by preventing unnecessary double effort and re-keying of work orders in separate systems.
•  Integration also enables companies to automate their calibration management process with ‘smart’ calibrators.

The limitations of using a standalone maintenance management system
•  Plant hierarchy and work orders can be stored in the CMM system, but calibration cannot be automated because the system cannot communicate with ‘smart’ calibrators.
•  Duplicated effort and re-keying of the same data into multiple databases.

Factors driving companies towards integration
•  Seamless integration of IT systems across plants and remote sites.
•  Sharing of critical plant and process information.
•  Productivity improvement, cost reduction and improved quality through the elimination of manual errors in re-keying data.


Fieldbus transmitters must also be calibrated

Fieldbus is becoming more and more common in today’s instrumentation. But what is fieldbus, and how does it differ from conventional instrumentation? Fieldbus transmitters must be calibrated as well, but how can it be done? Until now, no practical solutions have existed for calibrating fieldbus transmitters, but Beamex has now introduced the world’s first portable fieldbus calibrator – the MC5 Fieldbus Calibrator.
Conventional transmitters can deliver only one parameter at a time, in one direction. Each transmitter needs a dedicated pair of cables, and I/O subsystems are required to convert the analog mA signal into a digital format for the control system.
Fieldbus transmitters are able to deliver a huge amount of
information via the quick two-way bus. Several transmitters
can be connected to the same pair of wires. Conventional I/O
systems are no longer needed because segment controllers connect the
instrument segments to the quicker, higher-level fieldbus backbone.


Because fieldbus is an open standard, instruments from any manufacturer can be connected to the same fieldbus, plug-and-play.

History of fieldbus
Back in the 1940s, instrumentation utilized mainly pneumatic signals to transfer information from transmitters. During the 1960s, the mA signal was introduced, making things much easier. In the 1970s, computerized control systems began to make their arrival. The first digital, smart transmitter was introduced in the 1980s, initially using proprietary protocols. The first fieldbus was introduced in 1988, and throughout the 1990s a number of various fieldbuses were developed.
During the 1990s, manufacturers battled to see whose fieldbus would be the one most commonly used. A standard was finally set in the year 2000 when the IEC 61158 standard was approved. The Foundation Fieldbus H1 and the Profibus PA, both used in process instrumentation, were chosen as standards.
For the most part, one can say that Foundation Fieldbus dominates the North American market and Profibus is the market leader in Europe. Other areas are more divided. There are also certain applications that prefer a certain fieldbus regardless of geographical location.

Future of fieldbus
Currently, a large number of fieldbus installations already exist and
the number is increasing at a huge rate. A large portion of new projects
is currently being carried out using fieldbus. Critical applications and
hazardous areas have also begun to adopt fieldbus.
The Foundation Fieldbus and Profibus have begun to clearly dominate the fieldbus market. Both have reached such a large market share that both buses will most likely remain in use in the future. The development of new fieldbuses has slowed down, and it is unlikely that new fieldbus standards will appear in the near future to challenge the position of Foundation Fieldbus or Profibus.
Recent co-operation between Foundation Fieldbus and Profibus
suppliers will further strengthen the position of these two standards.


Fieldbus benefits for industry

Obviously, process plants would not start utilizing fieldbus if it did not offer them benefits compared to alternative systems. One important reason is the better return on investment. Although fieldbus hardware may cost the same as conventional hardware, or even a little more, the total installation costs for a fieldbus plant are far less than for a conventional one. There are many reasons for this, such as the reduction in field wiring, lower installation labour costs, lower planning/drawing costs, and no need for conventional I/O subsystems.
Another big advantage is the on-line self-diagnostics, which helps in predictive maintenance and eventually reduces downtime, offering maintenance savings. Remote configuration also helps to support reduced downtime. The improved system performance is an important criterion for some plants. There are also other advantages compared to conventional instrumentation.

Fieldbus transmitters must also be calibrated
The main difference between a fieldbus transmitter for pressure or temperature and a conventional or HART transmitter is that the output signal is a fully digital fieldbus signal.
The other parts of a fieldbus transmitter are largely comparable to those of conventional or HART transmitters. Changing the output signal does not change the need for periodic calibration. Although modern fieldbus transmitters have been improved compared to older transmitter models, this does not eliminate the need for calibration. There are also many other reasons, such as quality systems and regulations, that make periodic calibrations compulsory.

Calibrating fieldbus transmitters


The word “calibration” is often misused in fieldbus terminology compared with its meaning in metrology. In fieldbus terminology, “calibration” is often used to mean the configuration of a transmitter. In metrological terminology, “calibration” means comparing the transmitter with a traceable measurement standard and documenting the results.
So it is not possible to calibrate a fieldbus transmitter using only
a configurator or configuration software. Also, it is not possible to
calibrate a fieldbus transmitter remotely.


Fieldbus transmitters are calibrated in much the same way as conventional transmitters – you need to apply a physical input to the transmitter and simultaneously read the transmitter output to see that it is measuring correctly. The input is measured with a traceable calibrator, but you also need a way to read the output of the fieldbus transmitter. Reading the digital output is not always an easy thing to do.
When the fieldbus is up and running, you can have one person in the field to provide and measure the transmitter input while another person is in the control room reading the output. Naturally, these two people need to communicate with each other in order to perform and document the calibration.
While your fieldbus and process automation systems are idle, you need to find other ways to read the transmitter’s output. In some cases you can use a portable fieldbus communicator or a laptop computer with dedicated software and hardware.
In many cases, calibrating a fieldbus transmitter can be cumbersome and time-consuming, and may require an abundance of resources. Until now, no practical way to calibrate fieldbus transmitters has existed.

MC5 Fieldbus Calibrator


Beamex has introduced the revolutionary MC5 Fieldbus Calibrator, which is a combination of a multifunction process calibrator and a fieldbus configurator. The MC5 can be used for calibrating Foundation Fieldbus H1 or Profibus PA transmitters.
With the MC5 Fieldbus Calibrator you can calibrate fieldbus pressure and temperature transmitters, as the MC5 can simultaneously generate/measure the transmitter input and read the digital fieldbus output of the transmitter. The MC5 can also be used to change the configuration of a fieldbus transmitter. If the fieldbus transmitter fails in calibration, you can also use the MC5 to trim/adjust it to measure correctly.
Being a documenting calibrator, the MC5 automatically documents the calibration results of a fieldbus transmitter in its memory, from where the results can be uploaded to calibration management software. This eliminates the time-consuming and error-prone manual documentation of traditional methods.
The MC5 is a compact, easy-to-use and field-compatible calibration solution that also offers a lot of other functionality.
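The record a documenting calibrator stores is essentially a table of calibration points with their errors. A minimal Python sketch of the underlying arithmetic, using invented values for a 0–10 bar transmitter and an assumed 0.5 % of span tolerance:

    # Hypothetical calibration points for a 0-10 bar transmitter
    points = [(0.0, 0.02), (5.0, 5.04), (10.0, 10.01)]  # (input, digital output) in bar
    span = 10.0
    tolerance = 0.5  # allowed error, % of span

    for inp, out in points:
        error_pct_span = (out - inp) / span * 100
        status = "Pass" if abs(error_pct_span) <= tolerance else "Fail"
        print(f"input {inp:5.2f} bar  output {out:5.2f} bar  "
              f"error {error_pct_span:+.2f} % of span  {status}")

Uploading such a table to the calibration software, instead of copying it by hand, is where the time savings and the elimination of errors come from.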


[Photo: The MC5 can be used for calibrating Foundation Fieldbus H1 or Profibus PA transmitters.]

Main advantages of the MC5 Fieldbus Calibrator

The most important advantage of the MC5 Fieldbus Calibrator is the possibility to calibrate, configure and trim Foundation Fieldbus H1 or Profibus PA transmitters using a single unit. Because it is a combination of a calibrator and a fieldbus configurator, the MC5 is able to perform traceable calibration on fieldbus transmitters. Fieldbus configurators and configuration software are not able to do this; they can only be used to read and change configurations. Calibration can therefore be performed by one person instead of two.
The MC5 is able to calibrate stand-alone transmitters, or transmitters connected to a live fieldbus, as with the Foundation Fieldbus. There is no need for a separate power supply because the MC5 includes an integral power supply for powering up a stand-alone transmitter during calibration. Therefore, the MC5 can also be used during commissioning, when the fieldbus and control systems are still idle.
When new fieldbus transmitter models appear on the market, customers can easily update their MC5s by entering the device descriptions into them. Therefore, the MC5 never becomes obsolete.
Beamex’s MC5 has already been on the market for a few years, but even the oldest MC5s can be upgraded to an MC5 Fieldbus Calibrator. Therefore, an MC5 owner does not necessarily have to buy a new unit.
All of the above-mentioned features help the MC5 Fieldbus Calibrator save a great deal of time and money during commissioning and maintenance in a fieldbus plant. Being a multifunction calibrator, the MC5 can be used for various other jobs as well.

General functions of the MC5

Beamex’s MC5 is an all-in-one, accurate, documenting multifunction process calibrator for calibrating pressure, temperature, electrical and frequency signals. The MC5 also supports the HART protocol and can calibrate, configure and trim HART transmitters. The modular construction of the MC5 provides flexibility for user-specific requirements.
The most important features of the MC5 include accuracy, versatility, communication with calibration software, field compatibility (IP65) and modularity.


Calibration of weighing instruments
Part 1

From the point of view of the owner, weighing instruments, usually called scales or balances, should provide correct weighing results. How a weighing instrument is used and how reliable its weighing results are can vary considerably. Weighing instruments used for legal purposes must have legal verification.

If a weighing instrument is used in a quality system, the user must define its measurement capability. In any case, it is the owner or the user of the instrument who carries the final responsibility for measurement capability and who is also responsible for the processes involved. (S)he must select the weighing instrument and the maintenance procedure to be used to reach the required measurement capability.
From a regulatory point of view, the quality of a weighing instrument is already defined in OIML regulations, at least in Europe. Calibration is a means for the user to obtain evidence of the quality of weighing results, and the user must have the knowledge to apply the information obtained through calibration.

Calibration and legal verification


Weighing instruments may also possess special features. One of these
features includes making measurements for which legal verification is
required, for example when invoicing is based on the weight of a solid


material. The requirements may vary slightly from country to country, but in the EU they are the same, at least at the stage when the weighing instrument is first introduced into use.
Verification and calibration follow different philosophies. Calibration depicts the deviation between the indication and the reference (standard), including its tolerance, whereas verification depicts the maximum permissible errors of the indication. This is a feasible practice for all weighing. The practical work for both methods is very similar, and both methods can be used to confirm measurement capability, as long as legal verification is not needed. The terminology and practices used previously for verifying measurement capability, and for weighing technology in general, are based on these practices of calibrating and verifying, even when it is a question of general (non-legal) weighing.

Confirmation is the collecting of information
Confirming the capability of weighing instruments should happen by estimating the quality of the measuring device in the place where it will be used. In practice, this means investigating the performance of the weighing instrument; this operation is known as calibration (or verification). One calibration provides information on a temporary basis, and a series of calibrations provides time-dependent information.
The method of calibration should be selected such that it provides sufficient information for evaluating the required measuring tolerance. The method should be precise enough to achieve comparable results during all calibrations.
Comparing the indication of a weighing instrument with a standard gives the deviation, or error. However, to be able to define the measuring tolerance, we need more information about the weighing instrument, such as repeatability, eccentric loading, hysteresis, etc. We must remember that the quality of the evaluation of measuring tolerance depends on the information collected through calibration.
Using a calibration program, which goes through the same steps for
every calibration – calculates deviation and measuring tolerance, and, if
necessary, produces a calibration certificate – is the best way to achieve
reliable information to use in comparisons. This type of program
is able to store all the history of calibrated weighing instruments,
including information for other measuring devices. It is also handy
for monitoring measuring systems. The most important aspect of a


calibration program is that it allows the user to select the calibration method that corresponds to the required level of measuring tolerance, and it displays the history of calibrations, in this way providing the user with comprehensive information concerning measuring capability.

The purpose of calibration and complete confirmation

Calibration is a process whereby the user is able to confirm the correct functioning of the weighing instrument based on selected information. The user must define the limits for permitted deviation from the true value and the required measuring tolerance. If these values are exceeded, an adjustment or maintenance is necessary. Calibration itself, however, is a short-term process; the idea is that the weighing instrument remains in good working condition until the next calibration. For this reason, the user must determine all of the external factors which may influence the proper functioning of the weighing instrument. The factors in question may include the effect of the environment where the weighing instrument is used, how often the instrument needs to be cleaned, and regular monitoring of the zero point and of the indication with a constant mass.
Today, the function of weighing instruments, as well as many other instruments, is based on microprocessors. These offer several possibilities for adjusting parameters in measuring procedures. Calibration should be carried out using settings based on the parameters for normal use. It is very important that the users of the weighing instruments, as well as calibration personnel, are familiar with these parameters and use them as protocol. Since there are several parameters in use, it is important to always have the manual for the weighing instrument easily available to the user.

The content of the calibration certificate


Very often the calibration certificate is put on file as evidence of a
performed calibration to await the auditing of the quality system.
However, a quality system is usually concerned with the traceability
of measurements and the known measuring tolerance of the
measurements made. The calibration certificate of a single measuring
device is used as a tool for evaluating the process of measuring tolerance
and for displaying the traceability of the device in question.


Performing calibrations based on the measuring tolerance is better than doing routine measuring. Therefore, the user must evaluate the achieved measuring tolerance and compare it with the measuring tolerance required by the process.

SUMMARY

Calibration (or verification) is a fundamental tool for maintaining a measuring system. It also assists the user in obtaining the required quality of measurements in a process. The following must be taken into consideration:

•  the type of procedure to be applied in confirming measuring tolerance
•  the interpretation of the information provided by the calibration certificate
•  changing procedures based on the information received

Quality calibration methods and data handling systems offer state-of-the-art possibilities to any company.

aimo pusa
lahti precision
finland


Calibration of weighing instruments
Part 2

Any scale is capable of producing incorrect results.

Weighing is a common form of measurement in commerce, industry and households. Weighing instruments are often highly accurate, but users, i.e. their customers and/or regulatory bodies, often need to know just how inaccurate a particular scale may be. Originally, this information was obtained by classifying and verifying the equipment for type approval. Subsequently, the equipment was tested or calibrated on a regular basis.
how inaccurate a particular
Typical calibration procedures scale may be.
Calibrating scales involves several different procedures, depending on national and/or industry-specific guidelines or regulations, or on the potential consequences of erroneous weighing results. One clear and thorough guide is EA-10/18, Guidelines on the Calibration of Non-automatic Weighing Instruments, which was prepared by the European Co-operation for Accreditation and published by the European Collaboration in Measurement and Standards (euromet).
Typical scale calibration involves weighing various standard weights
in three separate tests:
•  repeatability test
•  eccentricity test
•  weighing test (test for errors of indication)
In the pharmaceutical industry in the United States, tests for
determining minimum weighing capability are also performed.


Repeated weighing measurements provide different indications


Usually, the object being weighed is placed on the load receptor
and the weighing result is read only once. If you weigh the object
repeatedly, you will notice slight, random variation in the indications.
The Repeatability Test involves weighing an object several times to
determine the repeatability of the scale used.
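The arithmetic behind the Repeatability Test is simply the standard deviation of repeated indications of the same load. A minimal Python sketch with invented readings:

    import statistics

    # Ten hypothetical indications (g) of the same 200 g load
    readings = [200.02, 199.98, 200.01, 200.03, 199.99,
                200.00, 200.02, 199.97, 200.01, 200.00]

    s = statistics.stdev(readings)  # sample standard deviation = repeatability
    print(f"Repeatability (standard deviation): {s:.4f} g")

This repeatability figure feeds directly into the uncertainty and minimum-weight calculations discussed later in this chapter.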

Center of gravity matters

Ideally, the object being weighed is placed in the middle of the load receptor as accurately as possible. This is sometimes difficult due to the shape or construction of the object being weighed. Typical calibration procedures therefore include the Eccentricity Test: by weighing the same weight at the corners of the load receptor, you can determine how much the eccentricity of the load affects the indication of the scale.

Test for errors in indication

The Weighing Test examines the error of the indication of the scale at several predefined loads. This enables you to correct the errors and to determine non-linearity and hysteresis.
If the scale’s maximum load limit is extremely large, it may be
impractical to use standard weights for calibrating the entire range.
In such a case, suitable substitution mass is used instead. Substitution
mass should also be used if the construction of the scale does not allow
the use of standard weights.

A truck scale is unsuitable for weighing letters


The purpose of the Minimum Weight Test is to determine the
minimum weight, which can be assuredly and accurately measured
using the scale in question. This condition is met if the measurement
error is less than 0.1% of the weight, with a probability of 99.73%.
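Given the stated condition (error below 0.1 % of the load with 99.73 % probability, i.e. within three standard deviations for normally distributed readings), the minimum weight can be estimated from the repeatability alone. A sketch under that three-standard-deviation assumption, treating repeatability as the dominant error source:

    def minimum_weight(repeatability_g: float) -> float:
        """Smallest load whose 3-sigma repeatability stays below 0.1 % of the load."""
        return 3 * repeatability_g / 0.001

    # With a repeatability standard deviation of 0.02 g:
    print(f"{minimum_weight(0.02):.0f} g")  # 60 g

In other words, the better the repeatability of the scale, the smaller the loads it can weigh to this accuracy.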

Combined standard uncertainty of the error u(E)


Knowing the error of the scale indication at the point of each
calibration is not sufficient. You must also know how certain you can


be about the error found at each point of calibration. There are several
sources of uncertainty of the error, e.g.:

• The masses of the weights are only known with a certain uncertainty.
• Air convection causes extra force on the load receptor.
• Air buoyancy around the weights varies according to barometric pressure, air temperature and humidity.
• A substitute load may be used in calibrating the scale.
• Digital scale indications are rounded to the resolution in use.
• Analog scales have limited readability.
• There are random variations in the indications, as can be seen in the Repeatability Test.
• The weights are not in the exact middle of the load receptor.

The values of uncertainty determined at each point of calibration are expressed as standard uncertainties (coverage probability: 68.27%), which correspond to one standard deviation of a normally distributed variable. The combined standard uncertainty of the error at a certain point of calibration has a coverage probability of 68.27% as well.

[Figure: Coverage intervals around the calibration error E = 2.5 g. The interval ±u(E) covers 68.27%, U(E) = 2u(E) covers 95.45%, and U(E) = 3u(E) covers 99.73% of a normally distributed error.]

Example: The calibration error and its uncertainty at the calibration point of 10 kg may be expressed as, e.g., E = 2.5 g and u(E) = ±0.7 g, which means that the calculated error in the indication is 2.5 g and the actual error, with a coverage probability of 68.27%, is between 1.8 g and 3.2 g.


Expanded uncertainty in calibration U(E)

In practice, a coverage probability of 68.27% is insufficient. Normally, it is extended to a level of 95.45% by multiplying the standard uncertainty by the coverage factor k = 2. If the distribution of the indicated error cannot be considered normal, or the reliability of the standard uncertainty value is insufficient, then a larger value should be used for the k-factor.
If you are able to use the k = 2 coverage factor, then the error and its expanded uncertainty at the point of calibration are E = 2.5 g and U(E) = ±1.4 g. This means that the calculated error of the indication is 2.5 g and the actual error, with a coverage probability of 95.45%, is between 1.1 g and 3.9 g.
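To make the arithmetic concrete, here is a minimal Python sketch. The individual component values are hypothetical, chosen so that they combine to roughly the 0.7 g standard uncertainty of the example, and the root-sum-of-squares combination assumes independent components:

    import math

    # Hypothetical standard uncertainty components (g), e.g. from the reference
    # weights, rounding of the digital indication, and repeatability
    components = [0.5, 0.3, 0.4]

    # Combined standard uncertainty: root sum of squares of independent components
    u_combined = math.sqrt(sum(u**2 for u in components))  # ~0.71 g

    k = 2                      # coverage factor for 95.45 % coverage
    U = k * u_combined         # expanded uncertainty, ~1.4 g
    E = 2.5                    # calibration error at the 10 kg point (from the example)

    print(f"E = {E} g, U(E) = ±{U:.1f} g")
    print(f"Actual error lies between {E - U:.1f} g and {E + U:.1f} g")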
Uncertainty of a weighing result

The purpose of calibration is to determine how accurate a weighing instrument is. As the above-mentioned case indicates, you know that if you repeat the calibration several times, the indication when weighing an object of 10 kilograms will be between 10.0011 kg and 10.0039 kg 95.45% of the time. However, the uncertainty of the results of later routine weighings is usually larger. Typical reasons for this are:
routine weighings is usually larger. Typical reasons for this are:
• Routine weighing measurements involve random loads, while
calibration is made at certain calibration points.
• Routine weighing measurements are not repeated whereas indications
received through calibrations may be averages of repeated weighing
measurements.
• Finer resolution is often used in calibration.
• Loading/unloading cycles in calibration and routine weighing may
be different.
• A load may be situated eccentrically in routine weighing.
• A tare balancing device may be used in routine weighing.
• The temperature, barometric pressure and relative humidity of the
air may vary.
• The adjustment of the weighing instrument may have changed.

Standard and expanded uncertainties of weighing results are calculated using the technical data of the weighing instrument, its calibration results, knowledge of its typical behaviour and knowledge of the conditions of the location where the instrument is used. Defining the uncertainty of weighing results is highly recommended, at least once, for all typical applications and always for critical applications. Calculating the uncertainty of weighing results assists you in deciding whether or not the accuracy of the weighing instrument is sufficient and how often it should be calibrated. However, determining the uncertainty of weighing results is not part of calibration.

Calibrating and testing weighing instruments using CMX

CMX's scale calibration enables you to uniquely configure the calibration and testing of each weighing instrument. Correspondingly, copying configurations from one scale to another is easy. Error limits can be set according to OIML or Handbook 44, and wide variation in user-specific limits is also possible.
CMX calculates the combined standard uncertainty and the expanded uncertainty for the calibration of the weighing instrument. It allows you to enter additional, user-defined uncertainty components alongside the supported uncertainty components. CMX's versatile calibration certificate, and the possibility to define a user-specific certificate, ensure that you can fulfill the requirements set for your calibration certificates.


Calibration in hazardous environments

Combustible vapors require special equipment

Striking a match in an environment that contains combustible gas is nothing short of dangerous – personal injury and property damage are likely consequences. Improperly calibrating an instrument in such a hazardous environment can be almost as dangerous.
The materials and fluids used in some processes are hazardous in the sense that they can ignite or explode. For example, hydrocarbons in mines, oil refineries, and chemical plants are flammable and are typically contained within vessels and pipes. If this were always the case, an external flame could not ignite the hydrocarbons. However, in many locations, leaks, abnormal conditions, and fluid accumulation may allow hydrocarbons to be present such that a flame could ignite them with disastrous results.
Hydrocarbons and other flammable fluids are not limited to the
petroleum and chemical industries. For example, combustible fuels,
such as natural gas, are used in all industries, including agriculture,
food, pharmaceuticals, power generation, pulp/paper, water/
wastewater, universities, retail, and in the home.
In addition, many materials and fluids used in seemingly “safe” industries are themselves flammable. Even seemingly safe water treatment systems use hazardous materials, such as chlorine, in their processes. This means that certain areas of a water treatment plant may well be considered hazardous. Similarly, certain areas of food plants, such as reactors that hydrogenate oils, may pose hazards as well. Therefore, it is important for plants to examine their processes


and identify hazardous locations so that the proper instruments are selected, installed, and maintained in accordance with practices that are appropriate for the hazard.

Equipment Requirements in Hazardous Locations


Protection requirements for hazardous locations vary according to the
type of material present, frequency of the hazard, and the protection
concept applied.
The intensity with which various vapors can combust is generally different. The groupings (IEC 60079-10), in order of decreasing ignition energy (with an example of a gas in each group), are:

Group IIC      Acetylene
Group IIB+H2   Hydrogen
Group IIB      Ethylene
Group IIA      Propane

The hazardous area classifications (IEC 60079-10), in order of decreasing frequency, are:

Zone 0   Flammable material present continuously
Zone 1   Flammable material present intermittently
Zone 2   Flammable material present abnormally

Intrinsic Safety (IS) is the most common protection concept applied to calibrators that are used in hazardous locations. In general, the IS concept is to design the calibrator so that the amount of energy available is limited and cannot ignite a combustible gas mixture. Adding the applicability of IS designs to the hazards in the previous table yields:

Zone 0   ia       Flammable material present continuously
Zone 1   ia, ib   Flammable material present intermittently
Zone 2   ia, ib   Flammable material present abnormally


In addition, a hot surface temperature on a device can cause ignition. Temperature classes limit the maximum surface temperature, ranging from 450˚C (T1) down to 85˚C (T6).
Beamex calibrators for hazardous locations are designed and certified
for Ex ia IIC T4 hazards per the ATEX Directive and are applicable to
all vapor hazards where a temperature class of 135˚C in a 50˚C ambient
is acceptable. As such, they can be used for the overwhelming majority
of applications where a vapor hazard is present.

Calibration Solutions for Hazardous Locations

Instruments that measure flow, level, pressure, temperature, and other variables in hazardous locations are generally used to monitor and control the process. In some applications, it is practical to remove these instruments and calibrate them in the workshop on a calibration test bench. This is usually not the case, however, which means that many instruments are calibrated in the field. Fortunately, there are calibrators that are specifically designed to operate safely in rugged environments and hazardous locations.
The Beamex MC5-IS is a portable, intrinsically safe, multifunctional calibrator that has modules which can accommodate wide ranges and many types of pressure, RTD, thermocouple, voltage, current, pulse, and frequency measurements.
The Beamex MCS100 modular calibration system is a test bench and
calibration system for workshops and laboratories that incorporates
the functionality of the MC5 multifunction calibrator and can
measure/generate additional parameters such as precision pressures.
The ergonomic design and modular construction allow the user to
select the necessary functions in a cost-effective manner.
The Beamex CMX software integrates calibration management by
allowing efficient planning and scheduling of calibration work. It not
only alerts you when to calibrate, but also automatically takes data,
creates documentation, adheres to cGMP regulations (21 CFR Part 11), and
tracks calibration history. This software generally makes calibration
work faster and easier and is designed to integrate into management
systems such as SAP/R3 and Maximo.


A few points to remember

• Improper actions in hazardous locations can result in property damage and bodily injury.

• Hazardous locations can exist in virtually all industries, stores, and in the home.

• Instruments should be specified, installed, operated, and maintained in accordance with the requirements for the hazardous location.

• Portable Beamex calibrators for hazardous locations are designed to be used in virtually all vapor hazards.


Improving power plant performance through calibration

Calibration helps a power plant maintain or even improve safety, as well as meet national and international standards. However, calibration is also a matter of profitability. By using high-accuracy calibration equipment, the accuracy of vital measurements can be maintained at the required level and the plant can increase its annual power production capability.

Power and Energy Operations
Electric, natural gas, oil, and other energy meters directly influence
revenue. Because of rising energy costs, power and energy operations
carefully scrutinize these devices to ensure that they accurately
account for their energy and bill correctly. Calibration issues that arise
in the power and energy industries are also similar to those faced in
power and energy consumer industries. Although not directly billable,
internal energy consumption and its allocation can make or break a
seemingly viable product. Proper instrument calibration helps ensure
that these measurements are performed accurately, especially because
annual billing can exceed US$ 1 billion in many plants.

Calibration Requirements for Increasing


Plant Productivity and Safety
Accurate measurements can allow increased energy production.
Therefore, high-accuracy calibrators, such as the Beamex® MC5 Multifunction Calibrator, have a significant role in improving plant productivity. For instance, reducing the measurement uncertainty in a nuclear power plant can potentially increase electrical production by up to nearly 2 percent (see the CNA case story).
The economic consequence of this seemingly small power increase is typically an increase in revenues of many millions of dollars, because the total value of power and energy flows can easily be over a billion dollars per year. Therefore, seemingly small measurement errors caused by poor calibration techniques can potentially result in major revenue losses. With these improved measurements comes the additional requirement of maintaining calibration equipment and techniques that are comparable with the improved instrument performance, with traceability to national and international standards such as NIST, ISO 9001 and ISO 17025.
Safety is based upon never exceeding established operating limits such as reactor power and cooling capacity. A byproduct of improved calibration is an improvement in safety and fewer problems, because instruments periodically calibrated to more accurate standards reduce the risk associated with these measurements. In addition, improved calibration standards can be used to detect instrument degradation sooner.

Power and Energy Industry Calibration


Field calibration allows the in-situ calibration of instruments that
measure electrical parameters such as voltage, current and power –
some of which may be used for billing purposes in a power plant.
The performance of natural gas and oil flow measurement systems
can also influence billing. In particular, the calibration of flowmeters
used to check the custody transfer flowmeters is important. Small
variances between these flowmeters can result in large billing differences
and could indicate a problem with the custody transfer flowmeter.
Field calibrators allow these calibrations to be performed accurately
and more efficiently, especially when the instrument is installed in a
location with poor personnel access.
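
A hypothetical worked example shows why such small variances matter;
all the figures below are assumptions chosen for illustration:

# Hypothetical figures: the billing exposure created by a small variance
# between a check flowmeter and the custody transfer flowmeter.
annual_billed_value = 150_000_000   # USD billed through the meter per year
meter_variance = 0.0025             # 0.25 % disagreement between the meters

billing_exposure = annual_billed_value * meter_variance
print(f"Potential billing exposure: USD {billing_exposure:,.0f} per year")
# -> Potential billing exposure: USD 375,000 per year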
Many electricity, steam, cogeneration, ethanol, bio-diesel, refinery
and other types of energy plants use these measurements to develop
process heat and material balances that describe their processes. Heat
and material balances can be instrumental in locating opportunities
that can save millions of dollars of energy. For example, increased steam


CASE: Central Nuclear de Almaraz (CNA), Spain

Increasing annual production with high-accuracy calibrators

Enhanced calibration equipment performance makes it possible
to perform calibration operations with better uncertainty levels,
which in turn makes it possible to improve production results. This
was achieved by using the Beamex® MC5 high-accuracy multifunction
calibrators for calibration in the power plant. Improving the
measurement of the parameters associated with the determination of
reactor power from 2% to 0.4% enabled the power in each unit to be
increased by 1.6%. This has a very significant effect on the annual
production.

Read the case story at: www.beamex.com/success

flow to a heat exchanger indicates either that the steam trap on the
heat exchanger leaks (wasting steam) or that the process has changed
(wasting steam); either way, the situation needs to be investigated
to reduce steam consumption.
Field calibrators help ensure that these instruments operate properly
and accurately quantify energy savings. In addition, instruments that
are regularly and accurately calibrated often tend to improve plant
safety and reduce the probability of equipment damage.
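
The steam example above can be expressed as a minimal balance check.
The function name, flow values and tolerance below are illustrative
assumptions, not part of any Beamex product:

# Minimal sketch (assumed names and numbers): flag a heat exchanger whose
# measured steam flow drifts above the value expected from the heat balance.
def check_steam_balance(measured_flow, expected_flow, tolerance=0.05):
    """Return a message when the deviation exceeds the tolerance."""
    deviation = (measured_flow - expected_flow) / expected_flow
    if deviation > tolerance:
        return f"Investigate: steam flow {deviation:.1%} above balance value"
    return "Within balance tolerance"

print(check_steam_balance(measured_flow=11.2, expected_flow=10.0))  # t/h
# -> Investigate: steam flow 12.0% above balance value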
Further, automated portable calibrators can improve the calibration
process by automating the generation of the transmitter inputs and the
recording of the transmitter measurements. This makes the calibration
process much less time-consuming and improves the accuracy with
which data is collected by reducing the probability of human error.
Expensive qualification requirements often preclude the opening
of transmitters in nuclear power plants. In these applications, the
calibration process can be performed faster and more accurately
with an automated calibrator as compared to using existing manual
calibration techniques.
The Beamex® MC5 multifunction calibrators and the Beamex®
CMX calibration software form an integrated, automated calibration
system. Calibrations performed using automated field calibrators


and calibration software with electronic documentation result in
more uniform calibrations that are less prone to human error. This
automated calibration process is faster and more complete than the
manual process. Aside from being more accurate, it also frees up
significant amounts of time for the technician to perform other work.
The integrated system of communicating calibrators and calibration
software also allows calibrations to be easily uploaded to a PC and
produces easy-to-read calibration certificates showing the accuracy
of the instrument. It also makes it possible to search for instruments
that are due for calibration.

SUMMARY

Improving power plant performance through calibration

An optimal calibration plan executed well can improve both the
safety and productivity of the plant.

Calibration improves safety


•  Calibration is carried out to ensure the quality and reliability
of the measurements by means of traceability to national and
international standards (e.g. NIST, ISO 9001, ISO 17025).
•  Plant safety is based on never exceeding the established
operating parameters (e.g. reactor power, cooling capacity).

Calibration improves efficiency


•  Enhanced calibration equipment performance results in better
uncertainty levels, with which it is possible to improve annual
production levels.
•  With an integrated calibration solution, such as the Beamex®
calibrators and calibration software, a plant can reduce the
amount of paperwork and the amount of equipment that has to
be taken out into the field.


Calibration in the pharmaceutical industry

Pharmaceutical Operations

The pharmaceutical industry produces products that directly
affect the lives of the majority of the billions of people that
inhabit the Earth. As such, a seemingly small mistake or failure
could adversely affect the health of thousands of people. Regulators
in the pharmaceutical industry recognize these stakes and have
implemented various regulations to ensure the integrity of
pharmaceutical processes and, hence, the safety and efficacy of the
pharmaceutical products on which billions of people rely.
Individuals who are not directly associated with the pharmaceutical
industry should take note because some aspects of these regulations
are being adopted in the process industries. Therefore, it would be
pragmatic to use equipment and adopt practices that either meet or
can easily be upgraded to meet pharmaceutical requirements.
Some high-volume pharmaceuticals are manufactured
using continuous processing techniques; however, pharmaceutical
manufacturing is typically performed in batches. As such, these
processes typically incorporate many pressure and temperature
measurements such as local indicators (gauges), transmitters and
switches. Many of these measurements are at extreme conditions such
as may be found in an autoclave. While there may be some flowmeters,
batch processes typically incorporate weighing instruments to
implement material additions. Some processes involve clean rooms
where the measurement of low differential pressures is important.
Process measurements can be critical to ensure product quality.


Calibrating instruments properly in a timely manner is an important
aspect of ensuring that a pharmaceutical product is manufactured
properly. Calibration documentation can be used to verify that
calibrations have been performed properly prior to producing products.
Should there be a problem, this information may prove to be a key
factor in determining when, where and/or how an error was made.
Therefore, governing bodies tend to regulate the type of information
collected and the time interval between calibrations.

Manual Documentation of Pharmaceutical Calibrations


Whereas specific regulations may vary somewhat around the world, the
underlying design premise behind calibration requirements is to ensure
that instrument calibrations are performed correctly. Traditionally,
this meant generating a paper trail of calibration information
including its time, date, test equipment, as-found data, as-left data,
and the like.
However, this is only the “tip of the iceberg”. Calibration must
be performed according to approved written procedures, and the
calibration records must be maintained for a certain period of time.
Each instrument should have a master history record, a unique
identity, a calibration period, and calibration error limits. Product,
process and safety instruments must be physically tagged and are
sometimes color-coded. Calibration standards should be more accurate
than the instruments they are used to calibrate. The calibration
of the calibration standard must be documented and performed
periodically. Using calibration standards that are traceable to
national and international standards is required. Additional
documentation pertaining to the technician and his/her qualifications
and certifications would also be needed to demonstrate that the
individual has been trained and is qualified and competent to perform
the calibrations.
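
As a rough sketch of what such a master record might contain, the
Python data structure below collects the fields named above; the field
names and layout are illustrative assumptions, not a prescribed schema:

# Illustrative sketch of a calibration record; field names are assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class CalibrationRecord:
    instrument_id: str      # unique identity / physical tag
    calibration_date: date
    technician: str         # documented as trained and qualified
    standard_used: str      # traceable calibration standard
    as_found_error: float   # error before adjustment, % of span
    as_left_error: float    # error after adjustment, % of span
    error_limit: float      # calibration error limit, % of span
    next_due: date          # derived from the calibration period

    def passed(self) -> bool:
        """True if the as-left error is within the error limit."""
        return abs(self.as_left_error) <= self.error_limit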
In most countries, these instrumentation requirements must
be implemented within the general context stipulated by 21
CFR Part 211 (Current Good Manufacturing Practice for Finished
Pharmaceuticals). Among many other requirements, all instruments
must be fit for their purpose and a documented change management
system must be in place.


Pharmaceutical companies complied with these requirements by
writing calibration information on paper forms that they developed
for this purpose. Despite conformance with the regulations, paper
documentation was both cumbersome and prone to human error.
Simply put, it is not difficult to make mistakes when calculations are
performed manually and most dates, times, measurements, calculation
results, and the like are written by hand.

Electronic Documentation of Pharmaceutical Calibrations


Electronic documentation systems that do not require any paper were
developed to overcome these disadvantages and reduce the amount of
time technicians spend in complying with documentation regulations.
However, electronic records do not inherently contain signatures that
identify the person performing a calibration. Therefore, 21 CFR Part
11 (Electronic Records; Electronic Signatures) addresses the
additional issue of documenting the identity of these people.
Electronic calibration systems for the pharmaceutical industry that
conform to 21 CFR Parts 11 and 211, such as Beamex’s CMX
calibration software and MC5 Multifunction Calibrators, can be
integrated to provide automated documentation with less human
intervention. This results in fewer human errors, improved work
quality, and improved efficiency that can directly affect profit.
Moreover, locating the original electronic records in one database
not only replaces paper records with traceable electronic records
with a history of change management, but also turns the calibration
system into a powerful repository of decision-making history that can
be used to improve calibration procedures. Versatile security settings
and multilevel user accounts help to ensure the security and integrity
of the system and track authorized and unauthorized database actions.

Calibration Solutions for the Pharmaceutical Industry


Instruments designed to measure flow, level, pressure, temperature,
and other variables are generally used to monitor and control
pharmaceutical processes. In some applications, it is practical to
remove these instruments and calibrate them on the bench. This is
generally not the case, so many instruments are calibrated in the field.
Fortunately, there are calibration systems that are specifically designed
to operate safely in rugged environments and hazardous locations.


A typical calibration process is illustrated in the figure below, where
the calibration master schedule identifies an instrument that needs
to be calibrated and downloads the appropriate calibration data into
a technician’s handheld calibrator. After entering his/her username
and password, the technician performs an automated “As Found”
calibration. If the instrument does not pass calibration (as determined
by the downloaded information in the calibrator), the technician
can calibrate the instrument and perform an automated “As Left”
calibration. Data from both calibrations are stored in the handheld
calibrator to be uploaded to the database, where they are documented.
The Beamex CMX software integrates calibration management by
allowing efficient planning and scheduling of calibration work. It not
only alerts you when to calibrate, but also automatically takes data,
creates documentation, adheres to cGMP regulations (21 CFR Parts
11 and 211), and tracks calibration history. This software generally
makes calibration work faster and easier and is designed to integrate
into management systems such as SAP/R3 and Maximo.

[Figure: Calibration process in a pharmaceutical company with the
Beamex calibration system. Check Out → “As Found” calibration →
if OK, Check In and print calibration certificate/sticker; if not OK,
Adjustment → “As Left” calibration → if OK, Check In and print
certificate/sticker; if not OK, Check In and issue a Deviation Report.
Results enter the system via the MC5, a PDA or manual entry at each
step.]
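
The decision logic in the figure can be summarised in a few lines of
Python; the function names are illustrative assumptions, and the sketch
ignores the check-out/check-in bookkeeping:

# Simplified sketch of the as-found / as-left flow shown above.
def calibrate(instrument_tag, run_test, adjust):
    """run_test returns True when the instrument is within tolerance."""
    if run_test(instrument_tag):          # automated "As Found" calibration
        return "print calibration certificate / sticker"
    adjust(instrument_tag)                # technician adjusts the instrument
    if run_test(instrument_tag):          # automated "As Left" calibration
        return "print calibration certificate / sticker"
    return "issue deviation report"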


The Beamex MC5 is a portable multifunction calibrator that has
modules that can accommodate wide ranges and many types of
pressure, RTD, thermocouple, voltage, current, pulse, and frequency
measurements.
The Beamex MCS100 modular calibration system is a test bench and
calibration system for workshops and laboratories, which incorporates
the functionality of the MC5 and can measure/generate additional
parameters such as precision pressures. The ergonomic design and
modular construction allow the user to select the necessary
calibration functions in a cost-effective manner.

CMX’s change management complies with FDA requirements
(21 CFR Part 11, Electronic Records and Electronic Signatures).


SUMMARY

Here are a few points to remember:

• Automated electronic calibration and electronic documentation
save time and are less prone to human error than are manual
calibration and paper documentation.

• Electronic documentation allows all calibration information to
be located in one database for easy access and use.

• Proper calibration and its documentation are important for
maintaining the safety and efficacy of pharmaceuticals.

• Portable Beamex calibrators for hazardous locations are
designed for use in virtually all pharmaceutical plants.

• Beamex software follows the guidelines of 21 CFR Parts 11
and 211 and makes calibration and its documentation easy.

Beamex products for the pharmaceutical industry:

• MC5 Multifunction Calibrator
• MC5-IS Intrinsically Safe Multifunction Calibrator
• CMX Calibration Management Software

Beamex services for the pharmaceutical industry:

• Installation and training
• Validation services (CMX calibration management software)


The benefits of using a documenting calibrator

Why calibrate?
For process manufacturers, regular calibration of instruments across
a manufacturing plant is common practice. In plant areas where
instrument accuracy is critical to product quality or safety,
calibration every six months – or even more frequently – is not
unusual.
However, the key final step in any calibration process –
documentation – is often neglected or overlooked because of a lack
of resources, time constraints or the pressure of everyday activities.
Indeed, many manufacturers are now outsourcing all or some of their
maintenance activities, and so the contractor too is now under the
same pressure to calibrate plant instruments quickly but accurately
and to ensure that the results are then documented for quality
assurance purposes and to provide full traceability.
The purpose of calibration itself is to determine how accurate an
instrument or sensor is. Although most instruments are very accurate
these days, regulatory bodies often need to know just how inaccurate
a particular instrument is and whether it drifts in and out of specified
tolerance over time.

What is a documenting calibrator?


A documenting calibrator is a handheld electronic communication
device that is capable of calibrating many different process signals such
as pressure, temperature and electrical signals, including frequency and
pulses, and then automatically documenting the calibration results by
transferring them to a fully integrated calibration management system.


Some calibrators can read the HART, Foundation Fieldbus or Profibus
output of transmitters and can even be used for configuring
‘smart’ sensors.
Heikki Laurila, Product Manager at Beamex in Finland comments:
“I would define a documenting calibrator as a device that has the dual
functionality of being able to save and store calibration results in its
memory, but which also integrates and automatically transfers this
information to some sort of calibration management software.”
A non-documenting calibrator is a device that either does not store
data, or stores calibration data from instruments but is not
integrated with a calibration management system. Calibration results
have to be keyed manually into a separate database, spreadsheet or
paper filing system.

Why use a documenting calibrator?

By using a documenting calibrator, the calibration results are stored
automatically in the calibrator’s memory during the calibration
process. The engineer does not have to write any results down on
paper, so the whole process is much faster and costs are therefore
reduced. The quality and accuracy of calibration results will also
improve, as there will be fewer mistakes due to human error.
The calibration results are transferred automatically from the
calibrator’s memory to the computer/database. This means the
engineer does not spend time transferring the results from his notepad
to final storage on a computer, again, saving time and money.
With instrument calibration, the calibration procedure itself is
critical. Performing the calibration procedure in the same way each
time is important for consistency of results. With a documenting
calibrator, the calibration procedure can be automatically transferred
from the computer to the handheld calibrator before going out into
the field.
As Heikki Laurila states: “Engineers who are out in the field
performing instrument calibrations, get instant pass or fail messages
with a documenting calibrator. The tolerances and limits for a sensor,
as well as detailed instructions on how to calibrate the transmitter,
are input once in the calibration management software and then
downloaded to the calibrator. This means calibrations are carried
out in the same way every time as the engineer is being told by the
calibrator which test point he needs to measure next. Also, having


an easy-to-use documenting calibrator is definitely the way forward,
especially if calibration is one of many tasks that the user has to carry
out in his or her daily maintenance routine.”
With a multifunction documenting calibrator, such as the
Beamex® MC5, the user doesn’t need to carry as much equipment
while out in the field. The MC5 can also be used to calibrate, configure
and trim Foundation Fieldbus H1 or Profibus PA transmitters.
Heikki Laurila continues: “With a documenting calibrator such as
the MC5, the user can download calibration instructions for hundreds
of different instruments into the device’s memory before going out into
the field. The corresponding calibration results for these instruments
can be saved in the device without the user having to return to his PC
in the office to download/upload data. This means the user can work
for several days in the field.”
Having a fully integrated calibration management system – using
documenting calibrators and calibration management software –
is important. Beamex® CMX Calibration Software ensures
that calibration procedures are carried out at the correct time and
that calibration tasks do not get forgotten, overlooked or become
overdue.

Benefits in Practice – Northern Energy Services Ltd.
Northern Energy Services (NES) Limited, based in Doncaster in the UK,
is using Beamex® CMX Calibration Software and the Beamex® MC5
Multifunction Calibrator to carry out instrument calibrations for its
customers.
NES is a service provider for the power generation and petrochemicals
industries. NES carries out a range of services for its clients, including
routine maintenance, calibrations, installations, electrical services,
fault diagnosis and repair, ATEX inspections and re-certifications.
David Tuczemskyi, Control & Instrumentation Engineer at NES
has worked in his role for more than 10 years. He comments: “Most of
our work is for gas-fired power stations in the UK and Ireland, which
involves a significant amount of instrument calibrations. Typically,
once a year for a major inspection, the plant will shut down for a
month and we are called in to carry out all instrument calibrations
across the site.”
In Tuczemskyi’s experience, the industry is often guilty of relying on
manual paper-based systems for documenting instrument calibrations.


As he explains: “Calibration is done manually, which takes longer
and is prone to manual error. Often, the field engineer calibrates the
instrument, handwrites the results onto a paper form of some kind,
and then has to re-enter this information into a database on his or her
return to the office. Unintentional errors often creep in here and the
whole process is time consuming.”
So, in the last few years, NES has issued its team of engineers
with the MC5 documenting calibrator and the company has also
bought the CMX calibration software. “We are actively pushing our
customers, the power generation plants, to use Beamex’s hardware
and software,” he explains. Why? Because, says Tuczemskyi, you
get higher accuracy, the calibration process is much faster and the
customer gets full traceability. “What surprised me most when I started
using the MC5 was that the calibration tasks were being done much
faster but also more accurately. In our industry, a faster job usually
means the engineer has cut corners. When you’ve got to calibrate
1,000 instruments across a site, typically with five-point checks on
each instrument, speed and accuracy are critical.”
After completing instrument calibrations, NES typically provides
its clients with a full quality assurance report of all instruments that
have been calibrated at the site, along with a calibration certificate if
required. “This not only ensures full traceability but looks professional
and reflects well on us as a service provider,” states Tuczemskyi.
cut corners.” Over the years, the biggest change that Tuczemskyi has seen in the
UK power industry is in regulations and the auditing process. “You
simply cannot get away with it now. Everything you do has to be
traceable. The problem here is that the end customer has little or no
time to spend with the contractor, so the contractor has to be fully
competent in everything it does. The customer wants to outsource as
much of the maintenance and calibration work as possible these days.
MC5 enables us to perform all the required instrument calibrations on
a site, but then automatically downloads this information to the CMX
software. We perform all the necessary back-ups for the customer and
the whole process is fully integrated and efficient.”
Instruments that require calibration are normally given a priority
rating by the customer. Safety-critical instruments often have the
highest priority and will be checked every three to six months, with
lower-priority instruments only being checked once a year or less.
“CMX removes
all these issues and enables users to prioritise their instruments and
then to receive automatic alerts when safety-critical sensors require


calibrating. When it comes to plant safety, you cannot always rely on
manual paper systems and poor databases.”
Tuczemskyi also likes the fact that the combination of using the MC5
and CMX means that instructions on how to calibrate an instrument
and the order in which to calibrate can all be downloaded to the
handset while the engineer is out in the field. “We did some calibration
work for a customer on three gas turbines, two of which were running,
the other on standby. Certain instruments we had to calibrate were
on a common block trip and we knew from our own experience that
we had to calibrate these in a specific order and method, otherwise
we could inadvertently cause one of the turbines to fail. A two-
hour shutdown on a gas turbine, for example, would have cost this
customer around £100,000 in lost downtime. By using the MC5 and
CMX software in situations like these, the contractor doesn’t have to
rely on experience like we did, but can download specific instructions
to the calibrator before calibrating the instruments, which ensures
costly mistakes out in the field are never made.”
According to Tuczemskyi, it took NES engineers only two weeks to
get to grips with the CMX software. “Technical support at Beamex is
absolutely superb. Three years ago, we were under immense pressure
to get a job completed, when we had a software glitch that prevented
us from uploading or downloading calibration results. We contacted
Beamex and the guys repaired our fault within two days.”
Company interviewed in this White Paper:
Northern Energy Services Ltd
206 Askern Road
Bentley
Doncaster
DN5 0EU
Tel: 01302 820790


SUMMARY

The benefits of using a documenting calibrator

•  Calibration results are automatically stored in the calibrator’s
on-board memory during the calibration procedure.
•  Calibration results are automatically transferred from the
calibrator’s memory to a computer or fully integrated
calibration management system.
•  Less paperwork and fewer manual errors.
•  Reduced costs from a faster and more efficient calibration
process.
•  Improved accuracy, consistency and quality of calibration
results.
•  A fully traceable calibration system for the whole plant.
•  The calibration procedure itself is guided by the calibrator,
which uploads detailed instructions from the computer or
calibration management software.
•  No manual printing or reading of calibration instructions is
required, again saving time and money and simplifying the
process.

When should a user consider choosing a documenting calibrator?

•  If your company performs regular, periodic calibrations
on a high volume of instruments across a plant, then
documenting calibrators can help reduce costs, time and
effort, while also ensuring calibration results are accurate
and consistent.
•  If your industry is highly regulated, documenting calibrators
ensure your calibration process is fully traceable.


The safest way to calibrate
– an introduction to intrinsically safe calibrators

There are industrial environments where calibrations should not
only be made accurately and efficiently, but also safely. When safety
becomes a top-priority issue in calibration, intrinsically safe
calibrators enter the picture.

What is intrinsically safe calibration?


By definition, intrinsic safety (IS) is a protection technique for safely
operating electronic equipment in explosive environments. The
concept has been developed for safely operating process control
instrumentation in hazardous areas. The idea behind intrinsic safety
is to make sure that the available electrical and thermal energy in a
system is always low enough that ignition of the hazardous atmosphere
cannot occur. A hazardous atmosphere is an area that contains
elements that may cause an explosion: source of ignition, a flammable
substance and oxygen.
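
In practice, this energy limitation is verified when an intrinsically
safe loop is designed: the output parameters of the associated
apparatus (barrier) must not exceed what the field device and cabling
can safely absorb. The sketch below illustrates the idea with commonly
used entity-parameter names and made-up example values; a real loop
assessment must follow the certification documents.

# Illustrative entity-parameter check for one IS loop (values are examples).
def loop_is_safe(barrier, device, cable):
    return (barrier["Uo"] <= device["Ui"]                    # voltage limit
            and barrier["Io"] <= device["Ii"]                # current limit
            and barrier["Po"] <= device["Pi"]                # power limit
            and barrier["Co"] >= device["Ci"] + cable["C"]   # capacitance
            and barrier["Lo"] >= device["Li"] + cable["L"])  # inductance

barrier = {"Uo": 28.0, "Io": 0.093, "Po": 0.65, "Co": 83e-9, "Lo": 3.5e-3}
device = {"Ui": 30.0, "Ii": 0.100, "Pi": 0.75, "Ci": 5e-9, "Li": 10e-6}
cable = {"C": 20e-9, "L": 1.0e-3}
print(loop_is_safe(barrier, device, cable))  # -> True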
Hazardous area classifications in IEC/European countries are:

Zone 0: an explosive gas & air mixture is continuously present or
present for a long time.
Zone 1: an explosive gas & air mixture is likely to occur in normal
operation.
Zone 2: an explosive gas & air mixture is not likely to occur in
normal operation, and if it occurs it will exist only for a short time.


An intrinsically safe calibrator is therefore designed to be incapable
of causing ignition in a surrounding environment that contains
flammable materials, such as gases, mists, vapors or combustible dust.
Intrinsically safe calibrators are also often referred to as “Ex calibrators”,
“calibrators for Ex Areas”, or “IS calibrators”. An Ex Area also refers
to an explosive environment and an Ex calibrator is a device designed
for use in the type of environment in question.

Where is intrinsically safe calibration required?


Many industries require intrinsically safe calibration equipment.
Intrinsically safe calibrators are designed for potentially explosive
environments, such as oil refineries, rigs and processing plants, gas
pipelines and distribution centres, petrochemical and chemical plants,
as well as pharmaceutical plants. Basically, any potentially explosive
industrial environment can benefit from using intrinsically safe
calibrators.
What are the benefits of using intrinsically safe calibrators?
There are clear benefits in using intrinsically safe calibration
equipment. First of all, it is the safest possible technique. Secondly,
the calibrators provide performance and functionality.
Safest possible technique. Intrinsically safe calibrators are safe
for employees, as they can be safely used in environments where the
risk of an explosion exists. In addition, intrinsically safe calibrators are
the only technique permitted for Zone 0 environments (explosive gas
and air mixture is continuously present or present for a long time).
Performance and functionality. Multifunctional intrinsically
safe calibrators provide the functionality and performance of regular
industrial calibration devices, but in a safe way. They can be used
for calibration of pressure, temperature and electrical signals. A
documenting intrinsically safe calibrator, such as the Beamex® MC5-
IS, provides additional efficiency improvements with its seamless
communication with calibration software. This eliminates the need
for manual recording of calibration data and improves the quality and
productivity of the entire calibration process.


Are intrinsically safe calibrators technically different from regular
industrial calibrators?

Intrinsically safe calibrators are different from other industrial
calibrators in both design and technical features. In view of safety,
there are also some guidelines and constraints for how to use them in
hazardous areas. Every intrinsically safe calibrator is delivered with
a product safety note, which should be read carefully before using the
device. The product safety note lists all the “do’s and don’ts” for
safe calibration.
The differences in design and technical features were made with
one purpose in mind — to ensure that the device is safe to use and
is unable to cause an ignition. The surface of the device is made of
conductive material. The battery of an intrinsically safe calibrator
is usually slower to charge and quicker to discharge. Intrinsically
safe equipment often operates only with dry batteries, but the
Beamex® intrinsically safe calibrators operate with rechargeable
batteries, which must be charged in a non-Ex area. External pressure
modules can be used with IS calibrators, but they must also be
intrinsically safe. There are also usually small differences in
electrical ranges compared to regular industrial calibrators (e.g. the
maximum is lower).

Making a calibrator safe and unable to cause ignition
– typical technical differences:
•  Surface made of conductive material
•  Constraints in using the device (listed in Product Safety Note)
•  Small differences with electrical ranges (e.g. maximum is lower)
•  Battery slower to charge, quicker to discharge
•  Battery must be charged in a non-Ex area
•  When using external pressure modules,
they must be IS-versions

What are ATEX and IECEx?


ATEX (“ATmosphères EXplosibles”, explosive atmospheres in French)
is a standard set in the European Union for explosion protection in the
industry. ATEX 95 equipment directive 94/9/EC concerns equipment


intended for use in potentially explosive areas. Companies in the
EU where the risk of explosion is evident must also use the ATEX
guidelines for protecting their employees. In addition, the ATEX
rules have been obligatory for electronic and electrical equipment
sold in the EU for use in potentially explosive atmospheres since
July 1, 2003.
IEC (International Electrotechnical Commission) is a non-profit
international standards organization that prepares and publishes
International Standards for electrical technologies. The IEC TC/31
technical committee deals with the standards related to equipment for
explosive atmospheres. IECEx is an international scheme for certifying
procedures for equipment designed for use in explosive atmospheres.
The objective of the IECEx Scheme is to facilitate international trade
in equipment and services for use in explosive atmospheres, while
maintaining the required level of safety.
As the Beamex® MC5-IS and the new Beamex® MC2-IS Intrinsically
Safe Multifunction Calibrators are certified according to ATEX and
the IECEx Scheme, the calibrators are assured to be fit for their
intended purpose, and sufficient information is supplied with them
to ensure that they can be used safely.

Is service different for intrinsically safe calibrators?


There are certain aspects that need special attention when doing
service or repair on an intrinsically safe calibrator. The most important
thing to remember is that an intrinsically safe calibrator must maintain
its intrinsic safety after the service or repair. The best way to do this
is to send it to the manufacturer or to an authorized service company
for repair.
Recalibration can be done by calibration laboratories, preferably
ones with ISO/IEC 17025 accreditation.

Make it safe with the Beamex® Intrinsically Safe Multifunction
Calibrators
Beamex offers two different calibrators for use in potentially explosive
environments.
The new ATEX and IECEx certified Beamex® MC2-IS Multifunction
Calibrator is a practical tool designed to be used in explosive
environments. It has calibration capabilities for pressure, temperature


and electrical signals, and it connects to almost 20 available Beamex
intrinsically safe external pressure modules. It has a compact size
and design and is very user-friendly.
The Beamex® MC5-IS Intrinsically Safe Multifunction Calibrator
is a high accuracy, all-in-one calibrator for extreme environments.
Being an all-in-one calibrator, the MC5-IS replaces many individual
measurement devices and calibrators. The MC5-IS is also ATEX and
IECEx certified. The MC5-IS has calibration capabilities for pressure,
temperature, electrical and frequency signals. It is a documenting
calibrator, which means that it communicates seamlessly with
calibration software. Using documenting calibrators with calibration
software can remarkably improve the efficiency and quality of the entire
calibration process. The MC5-IS also has HART® communication.
Both the MC5-IS and the MC2-IS are certified in accordance with
the IECEx and ATEX directive.

Beamex® MC2-IS Intrinsically Safe Multifunction Calibrator.
The new practical tool for calibration in potentially explosive
environments.


Why use software for calibration management?

Every plant has some sort of system in place for managing
calibration operations and data, but the different methods for doing
it vary greatly in terms of cost, quality, efficiency and accuracy of
data.

Introduction

Every manufacturing plant has some sort of system in place for
managing instrument calibration operations and data. Plant
instrumentation devices – such as temperature sensors, pressure
transducers and weighing instruments – require regular calibration to
ensure they are performing and measuring to specified tolerances.
However, different companies from a diverse range of industry
sectors use very different methods of managing these calibrations.
These methods differ greatly in terms of cost, quality, efficiency, and
accuracy of data and their level of automation.
Calibration software is one such tool that can be used to support and
guide calibration management activities, with documentation being
a critical part of this.
But in order to understand how software can help process plants
better manage their instrument calibrations, it is important to consider
the typical calibration management tasks that companies have to
undertake. There are five main areas here, comprising planning
and decision-making; organisation; execution; documentation; and
analysis.
Careful planning and decision-making is important. All plant


instruments and measurement devices need to be listed, then classified
into ‘critical’ and ‘non-critical’ devices. Once this has been agreed,
the calibration range and required tolerances need to be identified.
Decisions then need to be made regarding the calibration interval for
each instrument. The creation and approval of standard operating
procedures (SOPs) for each device is then required, followed by the
selection of suitable calibration methods and tools for execution of
these methods. Finally, the company must identify current calibration
status for every instrument across the plant.
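
The output of this planning stage is, in effect, a master record for
each instrument. A minimal sketch of such a record is shown below;
the field names and values are illustrative assumptions, not taken
from any particular system:

# Illustrative instrument master data produced by the planning stage.
instrument = {
    "tag": "TT-101",             # position / device identity (example)
    "critical": True,            # 'critical' vs 'non-critical' classification
    "range": (0.0, 150.0),       # calibration range, degrees C
    "tolerance": 0.5,            # required tolerance, degrees C
    "interval_months": 6,        # agreed calibration interval
    "sop": "SOP-CAL-014",        # approved calibration procedure (example)
    "last_calibrated": "2009-03-01",
}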
The next stage, organisation, involves training the company’s
calibration staff – typically maintenance technicians, service engineers,
process and quality engineers and managers – in using the chosen tools
and how to follow the approved SOPs. Resources then have to be
organised and assigned to actually carry out the scheduled calibration
tasks.
The execution stage involves supervising the assigned calibration
tasks. Staff carrying out these activities must follow the appropriate
instructions before calibrating the device, including any associated
safety procedures. The calibration is then executed according to the
plan, although further instructions may need to be followed after
calibration.
The documentation and storage of calibration results typically
involves signing and approving all calibration records that are generated.
The next calibration tasks then have to be scheduled, calibration labels
need to be created and affixed, and the resulting documents copied
and archived.
Based on the calibration results, companies then have to analyse the
data to see if any corrective action needs to be taken. The effectiveness
of calibration needs to be reviewed and calibration intervals checked.
These intervals may need to be adjusted based on archived calibration
history. If, for example, a sensor drifts out of its specification range,
the consequences could be disastrous for the plant, resulting in costly
production downtime, a safety problem or leading to batches of
inferior quality goods being produced, which may then have to be
scrapped.

Documentation
Documentation is a very important part of a calibration management
process. ISO 9001:2000 and the FDA both state that calibration


records must be maintained and that calibration must be carried out
according to written, approved procedures.
This means an instrument engineer can spend as much as 50 per
cent of his or her time on documentation and paperwork – time that
could be better spent on other value-added activities. This paperwork
typically involves preparing calibration instructions to help field
engineers; making notes of calibration results in the field; and
documenting and archiving calibration data.
Imagine how long and difficult a task this is if the plant has
thousands of instruments that require calibrating at least every six
months. The amount of manual documentation increases almost
exponentially!
When it comes to the volume of documentation required, different
industry sectors have different requirements and regulations. In the
Power & Energy sector, for example, just under a third of companies
(with 500+ employees) typically have more than 5,000 instruments
that require calibrating. 42 per cent of companies perform more than
2,000 calibrations each year.
In the highly regulated pharmaceuticals sector, a massive 75 per cent
of companies carry out more than 2,000 calibrations per year. Oil,
Gas & Petrochemicals is similarly high, with 55 per cent of companies
performing more than 2,000 calibrations each year. The percentage
is still quite high in the food & beverage sector, where 21 per cent of
firms said they calibrated their instruments more than 2,000 times
every year. This equates to a huge amount of paperwork for any process
plant.
The figures outlined appear to suggest that companies really
do require some sort of software tool to help them manage their
instrument calibration processes and all associated documentation.
However, the picture in reality can be very different.

Only a quarter of companies use calibration software


In Beamex’s own Calibration Study carried out in 2007, a mere 25
per cent of companies with 500+ employees (across the industry
sectors mentioned above) said that they did use specialist calibration
management software. Many other companies said that they relied
on generic spreadsheets and/or databases for this, whilst others used
a calibration module within an existing Computerised Maintenance
Management System (CMMS). A significant proportion (almost


20 per cent) of those surveyed said they used a manual, paper-based
system.
Any type of paper-based calibration system will be prone to human
error. Noting down calibration results by hand in the field and then
transferring these results into a spreadsheet back at the office may
seem archaic, but many firms still do this. Furthermore, analysis of
paper-based systems and spreadsheets is extremely time consuming,
if not almost impossible.
In a recent survey conducted by Control Magazine, 40 per cent of
companies surveyed said that they calculated calibration intervals by
using historical trend analysis – which is encouraging. However, many
of these firms said they were doing it without any sort of calibration
software to assist them. The other 60 per cent of companies determined
instrument calibration intervals based on either the manufacturer’s own
recommendation, or they used a uniform interval across the plant for
all instruments. Neither method is ideal in practice. Companies could
save so much time and reduce costs by using calibration management
software to analyse historical trends and calibration results.
Using software for calibration management enables faster, easier and
more accurate analysis of calibration records and identifying historical
trends. Plants can therefore reduce costs and optimise calibration
intervals by reducing calibration frequency when this is possible, or
by increasing the frequency where necessary.
For example, for improved safety, a process plant may find it
necessary to increase the calibration frequency of some sensors that
are located in a hazardous, potentially explosive area of the
manufacturing plant.
Just as important, by analysing the calibration history of a flow
meter that is located in a ‘non-critical’ area of the plant, the company
may be able to decrease the frequency of calibration, saving time and
resources. Rather than rely on the manufacturer’s recommendation for
calibration intervals, the plant may be able to extend these intervals by
looking closely at historical trends provided by calibration management
software. Instrument ‘drift’ can be monitored closely over a period of
time and then decisions taken confidently with respect to amending
the calibration interval.
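
A minimal sketch of such a history-trend calculation is shown below,
using made-up as-found data: fit a straight line to the observed drift
and estimate when it would reach the tolerance limit, which then
informs the interval decision:

# Illustrative drift-trend analysis; the history data below is made up.
def months_until_limit(history, tolerance):
    """history: (months since first calibration, as-found error) pairs."""
    n = len(history)
    mean_t = sum(t for t, _ in history) / n
    mean_e = sum(e for _, e in history) / n
    slope = (sum((t - mean_t) * (e - mean_e) for t, e in history)
             / sum((t - mean_t) ** 2 for t, _ in history))
    if slope <= 0:
        return None                      # no upward drift observed
    return (tolerance - mean_e) / slope + mean_t

history = [(0, 0.05), (6, 0.12), (12, 0.18), (18, 0.26)]  # % of span
print(months_until_limit(history, tolerance=0.5))  # -> about 39 months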
Regardless of industry sector, there seem to be some general
challenges that companies face when it comes to calibration
management.
The number of instruments and the total number of periodic
calibrations that these devices require can be several thousand per year.


Planning and keeping track of each instrument’s calibration
procedures is therefore a significant scheduling task. Furthermore, every
instrument calibration has to be documented and these documents
need to be easily accessible for audit purposes.

Paper-based systems
These systems typically involve hand-written documents; for example,
engineers using pens and paper to record calibration
results while out in the field. On returning to the office, these notes
are then tidied up or transferred to another paper document, after
which they are archived as paper documents.
While using a manual, paper-based system requires little or no
investment, it is very labour-intensive and means that historical trend
analysis becomes very difficult to carry out. In addition, the calibration
data is not easily accessible. The system is time consuming, soaks up a
lot of resources and typing errors are commonplace. Dual effort and
re-keying of calibration data are also significant costs here.
In-house legacy systems (spreadsheets, databases, etc.)
Although certainly a step in the right direction, using an in-house
legacy system to manage calibrations has its drawbacks. In these
systems, calibration data is typically entered manually into a
spreadsheet or database. The data is stored in electronic format, but
the recording of calibration information is still time-consuming and
typing errors are common. Also, the calibration process itself cannot
be automated. For example, automatic alarms cannot be set up on
instruments that are due for calibration.

Calibration module of a CMMS


Many plants have already invested in a Computerised Maintenance
Management (CMM) system and so continue to use this for calibration
management. Plant hierarchy and works orders can be stored in the
CMM system, but the calibration cannot be automated because the
system is not able to communicate with ‘smart’ calibrators.
Furthermore, CMM systems are not designed to manage calibrations
and so often only provide the minimum calibration functionality, such
as the scheduling of tasks and entry of calibration results. Although


instrument data can be stored and managed efficiently in the plant’s
database, the level of automation is still low. In addition, the CMM
system may not meet the regulatory requirements (e.g. FDA) for
managing calibration records.

Calibration Software
With specialist calibration management software, users are provided
with an easy-to-use Windows Explorer-like interface. The software
manages and stores all instrument and calibration data. This
includes the planning and scheduling of calibration work; analysis
and optimisation of calibration frequency; production of reports,
certificates and labels; communication with smart calibrators; and
easy integration with CMM systems such as SAP and Maximo. The
result is a streamlined, automated calibration process, which improves
quality, plant productivity and efficiency.

Benefits of Using Calibration Software

With software-based calibration management, planning and decision-
making are improved. Procedures and calibration strategies can be
planned and all calibration assets managed by the software. Position,
device and calibrator databases are maintained, while automatic alerts
for scheduled calibrations can be set up.
Organisation also improves. The system no longer requires pens and
paper. Calibration instructions are created using the software to guide
engineers through the calibration process. These instructions can also
be downloaded to a technician’s handheld documenting calibrator
while they are in the field.
Execution is more efficient and errors are eliminated. Using
software-based calibration management systems in conjunction with
documenting calibrators means that calibration results can be stored
in the calibrator’s memory, then automatically uploaded back to the
calibration software. There is no re-keying of calibration results from
a notebook to a database or spreadsheet. Human error is minimised
and engineers are freed up to perform more strategic analysis or other
important activities.
Documentation is also improved. The software generates reports
automatically and all calibration data is stored in one database rather
than multiple disparate systems. Calibration certificates, reports and


labels can all be printed out on paper or sent in electronic format.
Analysis becomes easier too, enabling engineers to optimise calibration
intervals using the software’s History Trend function.
Also, when a plant is being audited, calibration software can
facilitate both the preparation and the audit itself. Locating records
and verifying that the system works is effortless when compared to
traditional calibration record keeping.
Regulatory organisations and standards such as FDA and ISO
place demanding requirements on the recording of calibration data.
Calibration software has many functions that help in meeting these
requirements, such as Change Management, Audit Trail and Electronic
Signature functions. The Change Management feature in Beamex’s
CMX software, for example, complies with FDA requirements.

Business Benefits

For the business, implementing software-based calibration management
means overall costs will be reduced. These savings come from the
now-paperless calibration process, with no manual documentation
procedures. Engineers can analyse calibration results to see whether
the calibration intervals on plant instruments can be altered. For
example, those instruments that perform better than expected may
well justify a reduction in their calibration frequency.
Plant efficiencies should also improve, as the entire calibration process
is now streamlined and automated. Manual procedures are replaced
with automated, validated processes, which is particularly beneficial if
the company is replacing a lot of labour-intensive calibration activities.
Costly production downtime will also be reduced.
Even if a plant has already implemented a CMM system, calibration
management software can easily be integrated with this system. If the
plant instruments are already defined in a database, the calibration
management software can utilise the records available in the CMM
system database.
The integration will save time, reduce costs and increase productivity
by preventing unnecessary double effort and re-keying of works orders
in multiple systems. Integration also enables the plant to automate its
calibration management with smart calibrators, which simply is not
possible with a standalone CMM system.

CHECKLIST
Choosing the right calibration software

•  Is it easy to use?
•  What are the specific requirements in terms of functionality?
•  Are there any IT requirements or restrictions for choosing
the software?
•  Does the calibration software need to be integrated with the
plant’s existing systems?
•  Is communication with smart calibrators a requirement?
•  Does the supplier offer training, implementation, support and
upgrades?
•  Does the calibration software need to be scalable?
•  Can data be imported to the software from the plant’s current
systems?
•  Does the software offer regulatory compliance?
•  What are the supplier’s references and experience as a software
developer?


SUMMARY

Calibration software improves calibration management tasks
in all these areas:
•  Planning & Decision-Making
•  Organisation
•  Execution
•  Documentation
•  Analysis

The business benefits of using software for calibration management:
•  Cost reduction
•  Quality improvements
•  Increase in efficiency

Benefits for all process plants

Beamex’s suite of calibration management software can benefit all
sizes of process plant. For relatively small plants, where calibration
data is needed for only one location, only a few instruments require
calibrating and regulatory compliance is minimal, Beamex CMX Light
is the most appropriate software.
For medium-to-large sized companies that have multiple users who
have to deal with a large number of instruments and calibration work,
as well as strict regulatory compliance, Beamex CMX Professional is
ideal.
Beamex’s high-end solution, CMX Enterprise, is suitable for process
manufacturers with multiple global sites, multilingual users and a very
large number of instruments that require calibration. Here, a central
calibration management database is often implemented that is used
by multiple plants across the world.

Beamex Users

In 2008, Beamex conducted a survey of its customers, across all
industry sectors. The results showed that 82% of CMX Calibration
software customers said that using Beamex products had resulted in
cost savings in some part of their operations.
94% of CMX users stated that using Beamex products had improved
the efficiency of their calibration processes, whilst 92% said that using
CMX had improved the quality of their calibration system.

Summary
Every type of process plant, regardless of industry sector, can benefit
from implementing specialist calibration management software.
Compared to traditional, paper-based systems, in-house built legacy
calibration systems or calibration modules of CMM systems, using
dedicated calibration management software results in improved
quality, increased productivity and reduced costs of the entire
calibration process.
Despite these benefits, only one quarter of companies who need
to manage instrument calibrations actually use software designed for
that purpose.

Calibration terminology A to Z¹

Introduction

This glossary is a quick reference to the meaning of common
terms. It is a supplement to the VIM, GUM, NCSL Glossary,
and the information in the other references listed at the end.
In technical, scientific and engineering work (such as metrology)
it is important to correctly use words that have a technical meaning.
Definitions of these words are in relevant national, international
and industry standards; journals; and other publications, as well as
publications of relevant technical and professional organizations.
Those documents give the intended meaning of the word, so everyone
in the business knows what it is. In technical work, only the technical
definitions should be used.
Many of these definitions are adapted from the references. In some
cases several may be merged to better clarify the meaning or adapt
the wording to common metrology usage. The technical definitions
may be different from the definitions published in common grammar
dictionaries. However, the purpose of common dictionaries is to record
the ways that people actually use words, not to standardize the way
the words should be used. If a word is defined in a technical standard, its
definition from a common grammar dictionary should never be used in work
where the technical standard can apply.

Terms that are not in this glossary may be found in one of these
primary references:
1. ISO. 1993. International vocabulary of basic and general terms in
metrology (called the VIM); BIPM, IEC, IFCC, ISO, IUPAC,
IUPAP, and OIML. Geneva: ISO.
2. ANSI/NCSL. 1997. ANSI/NCSL Z540-2-1997, U. S. Guide to the
expression of uncertainty in measurement (called the GUM). Boulder,
CO: NCSL International.
3. NCSL. 1999. NCSL Glossary of metrology-related terms. 2nd ed.
Boulder, CO: NCSL International.

Some terms may be listed in this glossary in order to expand on
the definition, but should be considered an addition to the references
listed above, not a replacement of them. (It is assumed that a calibration
or metrology activity owns copies of these as part of its basic reference
material.)

Glossary
Accreditation (of a laboratory) – Formal recognition by an accreditation
body that a calibration or testing laboratory is able to competently
perform the calibrations or tests listed in the accreditation
scope document. Accreditation includes evaluation of both the
quality management system and the competence to perform the
measurements listed in the scope.
Accreditation body – An organization that conducts laboratory
accreditation evaluations in conformance to ISO Guide 58.
Accreditation certificate – Document issued by an accreditation
body to a laboratory that has met the conditions and criteria for
accreditation. The certificate, with the documented measurement
parameters and their best uncertainties, serves as proof of accredited
status for the time period listed. An accreditation certificate
without the documented parameters is incomplete.
Accreditation criteria – Set of requirements used by an accrediting
body that a laboratory must meet in order to be accredited.
Accuracy (of a measurement) – Accuracy is a qualitative indication of
how closely the result of a measurement agrees with the true value
of the parameter being measured. (VIM, 3.5) Because the true
value is always unknown, accuracy of a measurement is always
an estimate. An accuracy statement by itself has no meaning
other than as an indicator of quality. It has quantitative value
only when accompanied by information about the uncertainty
of the measuring system. Contrast with: accuracy (of a measuring
instrument)
Accuracy (of a measuring instrument) – Accuracy is a qualitative
indication of the ability of a measuring instrument to give
responses close to the true value of the parameter being measured.
(VIM, 5.18) Accuracy is a design specification and may be verified
during calibration. Contrast with: accuracy (of a measurement)
Assessment – Examination typically performed on-site of a testing or
calibration laboratory to evaluate its conformance to conditions
and criteria for accreditation.
Best measurement capability – For an accredited laboratory, the best
measurement capability for a particular quantity is “the smallest
uncertainty of measurement a laboratory can achieve within its
scope of accreditation when performing more or less routine
calibrations of nearly ideal measurement standards intended to
define, realize, conserve, or reproduce a unit of that quantity or
one or more of its values; or when performing more-or-less routine
calibrations of nearly ideal measuring instruments designed for the
measurement of that quantity.” (EA-4/02) The best measurement
capability is based on evaluations of actual measurements
using generally accepted methods of evaluating measurement
uncertainty.
Bias – Bias is the known systematic error of a measuring instrument.
(VIM, 5.25) The value and direction of the bias is determined by
calibration and/or gage R&R studies. Adding a correction, which
is always the negative of the bias, compensates for the bias. See also:
correction, systematic error
Calibration – (1) (See VIM, 6.11 and NCSL pages 4–5 for primary
and secondary definitions.) Calibration is a term that has many
different – but similar – definitions. It is the process of verifying
the capability and performance of an item of measuring and test
equipment by comparison to traceable measurement standards.
Calibration is performed with the item being calibrated in
its normal operating configuration – as the normal operator
would use it. The calibration process uses traceable external
stimuli, measurement standards, or artifacts as needed to verify
the performance. Calibration provides assurance that the
instrument is capable of making measurements to its performance
specification when it is correctly used. The result of a calibration is
a determination of the performance quality of the instrument with
respect to the desired specifications. This may be in the form of
a pass/fail decision, determining or assigning one or more values,
or the determination of one or more corrections. The calibration
process consists of comparing an IM&TE unit with specified
tolerances, but of unverified accuracy, to a measurement system
or device of specified capability and known uncertainty in order
to detect, report, or minimize by adjustment any deviations from
the tolerance limits or any other variation in the accuracy of the
instrument being compared. Calibration is performed according
to a specified documented calibration procedure, under a set of
specified and controlled measurement conditions, and with a
specified and controlled measurement system.

Notes:
•  A requirement for calibration does not imply that the item
being calibrated can or should be adjusted.
•  The calibration process may include, if necessary, calculation
of correction factors or adjustment of the instrument being
compared to reduce the magnitude of the inaccuracy.
•  In some cases, minor repair such as replacement of batteries,
fuses, or lamps, or minor adjustment such as zero and span,
may be included as part of the calibration.
•  Calibration does not include any maintenance or repair actions
except as just noted. See also: performance test, calibration
procedure. Contrast with: calibration (2) and repair

Calibration – (2) (A) Many manufacturers incorrectly use the term
calibration to name the process of alignment or adjustment of
an item that is either newly manufactured or is known to be out
of tolerance, or is otherwise in an indeterminate state. Many
calibration procedures in manufacturers’ manuals are actually
factory alignment procedures that only need to be performed if a
UUC is in an indeterminate state because it is being manufactured,
is known to be out of tolerance, or after it is repaired. When used
this way, calibration means the same as alignment or adjustment,
which are repair activities and excluded from the metrological
definition of calibration.
(B) In many cases, IM&TE instruction manuals may use
calibration to describe tasks normally performed by the operator
of a measurement system. Examples include performing a self-
test as part of normal operation or performing a self-calibration
(normalization) of a measurement system before use. When calibration
is used to refer to tasks like this, the intent is that they are part
of the normal work done by a trained user of the system. These
and similar tasks are excluded from the metrological definition of
calibration. Contrast with: calibration (1) See also: normalization,
self-calibration, standardization
Calibration activity or provider – A laboratory or facility – including
personnel – that perform calibrations in an established location or
at customer location(s). It may be external or internal, including
subsidiary operations of a larger entity. It may be called a calibration
laboratory, shop, or department; a metrology laboratory or
department; or an industry-specific name; or any combination or
variation of these.
Calibration certificate – (1) A calibration certificate is generally a
document that states that a specific item was calibrated by an
organization. The certificate identifies the item calibrated, the
organization presenting the certificate, and the effective date. A
calibration certificate should provide other information to allow
the user to judge the adequacy and quality of the calibration. (2)
In a laboratory database program, a certificate often refers to the
permanent record of the final result of a calibration. A laboratory
database certificate is a record that cannot be changed; if it is
amended later a new certificate is created. See also: calibration
report
Calibration procedure – A calibration procedure is a controlled
document that provides a validated method for evaluating and
verifying the essential performance characteristics, specifications,
or tolerances for a model of measuring or testing equipment.
A calibration procedure documents one method of verifying
the actual performance of the item being calibrated against its
performance specifications. It provides a list of recommended
calibration standards to use for the calibration; a means to record
quantitative performance data both before and after adjustments;
and information sufficient to determine if the unit being calibrated
is operating within the necessary performance specifications. A
calibration procedure always starts with the assumption that the
unit under test is in good working order and only needs to have
its performance verified. Note: A calibration procedure does not
include any maintenance or repair actions.
Calibration program – A calibration program is a process of the
quality management system that includes management of the
use and control of calibrated inspection, and test and measuring
equipment (IM&TE), and the process of calibrating IM&TE used
to determine conformance to requirements or used in supporting
activities. A calibration program may also be called a measurement
management system (ISO 10012:2003).
Calibration report – A calibration report is a document that provides
details of the calibration of an item. In addition to the basic items
of a calibration certificate, a calibration report includes details of
the methods and standards used, the parameters checked, and the
actual measurement results and uncertainty. See also: calibration
certificate
Calibration seal – A calibration seal is a device, placard, or label that,
when removed or tampered with, and by virtue of its design and
material, clearly indicates tampering. The purpose of a calibration
seal is to ensure the integrity of the calibration. A calibration seal
is usually imprinted with a legend similar to “Calibration Void
if Broken or Removed” or “Calibration Seal – Do Not Break or
Remove.” A calibration seal provides a means of deterring the user
from tampering with any adjustment point that can affect the
calibration of an instrument and detecting an attempt to access
controls that can affect the calibration of an instrument. Note: A
calibration seal may also be referred to as a tamper seal.
Calibration standard – (See VIM, 6.1 through 6.9, and 6.13, 6.14; and
NCSL pages 36–38.) A calibration standard is an IM&TE item,
artifact, standard reference material, or measurement transfer
standard that is designated as being used only to perform calibrations
of other IM&TE items. As calibration standards are used to
calibrate other IM&TE items, they are more closely controlled and
characterized than the workload items they are used for. Calibration
standards generally have lower uncertainty and better resolution than
general-purpose items. Designation as a calibration standard is based
on the use of the specific instrument, however, not on any other
consideration. For example, in a group of identical instruments, one
might be designated as a calibration standard while the others are
all general purpose IM&TE items. Calibration standards are often
called measurement standards. See also: standard (measurement)
Combined standard uncertainty – The standard uncertainty of the
result of a measurement, when that result is obtained from the
values of a number of other quantities. It is equal to the positive
square root of a sum of terms. The terms are the variances or
covariances of these other quantities, weighted according to how
the measurement result varies with changes in those quantities.
(GUM, 2.3.4) See also: expanded uncertainty
Competence – For a laboratory, the demonstrated ability to perform
the tests or calibrations within the accreditation scope and to meet
other criteria established by the accreditation body. For a person,
the demonstrated ability to apply knowledge and skills. Note: The
word qualification is sometimes used in the personal sense, since it
is a synonym and has more accepted usage in the United States.
Confidence interval – A range of values that is expected to contain the
true value of the parameter being evaluated with a specified level
of confidence. The confidence interval is calculated from sample
statistics. Confidence intervals can be calculated for points, lines,
slopes, standard deviations, and so on. For an infinite (or very large
compared to the sample) population, the confidence interval is:

CI = x̄ ± t·s/√n   or   CI = p ± t·√(p(1 − p)/n)

where
CI is the confidence interval,
n is the number of items in the sample,
p is the proportion of items of a given type in the population,
s is the sample standard deviation,
x̄ is the sample mean, and
t is the Student's t value for α/2 and (n − 1) degrees of freedom
(α is the level of significance).
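For readers who want a computational check of the mean-based form above, the following minimal sketch (with hypothetical readings; the t value of 2.262 is simply the tabulated Student's t for α = 0.05 and n − 1 = 9 degrees of freedom) uses only the Python standard library:

    import math
    import statistics

    # Ten hypothetical repeat readings of the same quantity
    readings = [10.02, 10.01, 10.03, 9.99, 10.00,
                10.02, 10.01, 10.00, 10.03, 9.98]

    n = len(readings)
    mean = statistics.mean(readings)   # sample mean (x-bar)
    s = statistics.stdev(readings)     # sample standard deviation
    t = 2.262                          # Student's t, alpha = 0.05, 9 df

    half_width = t * s / math.sqrt(n)
    print(f"CI = {mean:.4f} +/- {half_width:.4f}")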

Correction (of error) – A correction is the value that is added to the
raw result of a measurement to compensate for known or estimated
systematic error or bias. (VIM, 3.15) Any residual amount is treated
as random error. The correction value is equal to the negative of
the bias. An example is the value calculated to compensate for
the calibration difference of a reference thermometer or for the
calibrated offset voltage of a thermocouple reference junction. See
also: bias, error, random error, systematic error
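For example (hypothetical figures): if calibration shows that a reference thermometer reads 0.15 °C high, its bias is +0.15 °C and the correction is −0.15 °C, which is added to every raw reading taken with that thermometer.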

Corrective action – Corrective action is something done to correct a
nonconformance when it arises, including actions taken to prevent
reoccurrence of the nonconformance. Compare with: preventive
action
Coverage factor – A numerical factor used as a multiplier of the
combined standard uncertainty in order to obtain an expanded
uncertainty. (GUM, 2.3.6) The coverage factor is identified by
the symbol k. It is usually given the value 2, which approximately
corresponds to a probability of 95 percent for degrees of freedom
greater than 10.
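For example (hypothetical figures): if the combined standard uncertainty of a temperature measurement is u_c = 0.12 °C, applying a coverage factor of k = 2 gives an expanded uncertainty of U = k × u_c = 0.24 °C, quoted at approximately 95 percent confidence.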
Deficiency – Nonfulfillment of conditions and/or criteria for
accreditation, sometimes referred to as a nonconformance.
Departure value – A term used by a few calibration laboratories to refer
to bias, error or systematic error. The exact meaning can usually be
determined from examination of the calibration certificate.
Equivalence – (A) Acceptance of the competence of other national
metrology institutes (NMI), accreditation bodies, and/or accredited
organizations in other countries as being essentially equal to the
NMI, accreditation body, and/or accredited organizations within
the host country.
(B) A formal, documented determination that a specific instrument
or type of instrument is suitable for use in place of the one originally
listed, for a particular application.
Error (of measurement) – (See VIM, 3.10, 3.12–3.14; and NCSL pages
11–13.) In metrology, error (or measurement error) is an estimate
of the difference between the measured value and the probable
true value of the object of the measurement. The error can
never be known exactly; it is always an estimate. Error may be
systematic and/or random. Systematic error (also known as bias)
may be corrected. See also: bias, correction (of error), random error,
systematic error
Gage R&R – Gage repeatability and reproducibility study, which
(typically) employs numerous instruments, personnel, and
measurements over a period of time to capture quantitative
observations. The data captured are analyzed statistically to obtain
best measurement capability, which is expressed as an uncertainty
with a coverage factor of k = 2 to approximate 95 percent. The
number of instruments, personnel, measurements, and length of
time are established to be statistically valid consistent with the size
and level of activity of the organization.

GUM – An acronym commonly used to identify the ISO Guide to the
Expression of Uncertainty in Measurement. In the United States, the
equivalent document is ANSI/NCSL Z540-2-1997, U. S. Guide to
the Expression of Uncertainty in Measurement.
IM&TE – The acronym IM&TE refers to inspection, measuring,
and test equipment. This term includes all items that fall under
a calibration or measurement management program. IM&TE
items are typically used in applications where the measurement
results are used to determine conformance to technical or quality
requirements before, during, or after a process. Some organizations
do not include instruments used solely to check for the presence
or absence of a condition (such as voltage, pressure, and so on)
where a tolerance is not specified and the indication is not critical
to safety. Note: Organizations may refer to IM&TE items as MTE
(measuring and testing equipment), TMDE (test, measuring,
and diagnostic equipment), GPETE (general purpose electronic
test equipment), PME (precision measuring equipment), PMET
(precision measuring equipment and tooling), or SPETE (special
purpose electronic test equipment).
Interlaboratory comparison – Organization, performance, and
evaluation of tests or calibrations on the same or similar items
or materials by two or more laboratories in accordance with
predetermined conditions.
Internal audit – A systematic and documented process for obtaining
audit evidence and evaluating it objectively to verify that a
laboratory’s operations comply with the requirements of its quality
system. An internal audit is done by or on behalf of the laboratory
itself, so it is a first-party audit.
International Organization for Standardization (ISO) – An international
nongovernmental organization chartered by the United Nations
in 1947, with headquarters in Geneva, Switzerland. The mission
of ISO is “to promote the development of standardization and
related activities in the world with a view to facilitating the
international exchange of goods and services, and to developing
cooperation in the spheres of intellectual, scientific, technological
and economic activity.” The scope of ISO’s work covers all fields of
business, industry and commerce except electrical and electronic
engineering. The members of ISO are the designated national
standards bodies of each country. (The United States is represented
by ANSI.) See also: ISO

International System of Units (SI) – A defined and coherent system of
units adopted and used by international treaties. (The acronym SI
is from the French Système International.) SI is the international
system of measurement for all physical quantities; its base quantities
are mass, length, amount of substance, time, electric current,
thermodynamic temperature, and luminous intensity. SI units are
defined and maintained by
the International Bureau of Weights and Measures (BIPM) in Paris,
France. The SI system is popularly known as the metric system.
ISO – Iso is a Greek word root meaning equal. The International
Organization for Standardization chose the word as the short
form of the name, so it will be a constant in all languages. In this
context, ISO is not an acronym. (If an acronym based on the full
name were used, it would be different in each language.) The
name also symbolizes the mission of the organization – to equalize
standards worldwide.
Level of confidence – Defines an interval about the measurement result
that encompasses a large fraction p of the probability distribution
characterized by that result and its combined standard uncertainty,
and p is the coverage probability or level of confidence of the
interval. Effectively, the coverage level expressed as a percent.
Management review – The planned, formal, periodic, and scheduled
examination of the status and adequacy of the quality management
system in relation to its quality policy and objectives by the
organization’s top management.
Measurement – A set of operations performed for the purpose of
determining the value of a quantity. (VIM, 2.1)
Measurement system – A measurement system is the set of equipment,
conditions, people, methods, and other quantifiable factors that
combine to determine the success of a measurement process.
The measurement system includes at least the test and measuring
instruments and devices, associated materials and accessories, the
personnel, the procedures used, and the physical environment.
Metrology – Metrology is the science and practice of measurement
(VIM, 2.2).
Mobile operations – Operations that are independent of an established
calibration laboratory facility. Mobile operations may include work
from an office space, home, vehicle, or the use of a virtual office.
Natural (physical) constant – A natural constant is a fundamental
value that is accepted by the scientific community as valid. Natural
constants are used in the basic theoretical descriptions of the
universe. Examples of natural physical constants important in
metrology are the speed of light in a vacuum (c), the triple point of
water (273.16 K), the quantum charge ratio (h/e), the gravitational
constant (G), the ratio of a circle’s circumference to its diameter
(π), and the base of natural logarithms (e).
NCSL International – Formerly known as the National Conference
of Standards Laboratories (NCSL). NCSL was formed in
1961 to "promote cooperative efforts for solving the common
problems faced by measurement laboratories." NCSL has member
organizations from academic, scientific, industrial, commercial
and government facilities around the world. NCSL is a nonprofit
organization, whose membership is open to any organization with
an interest in the science of measurement and its application in
research, development, education, or commerce. NCSL promotes
technical and managerial excellence in the fields of metrology,
measurement standards, instrument calibration, and test and
measurement.
Normalization, Normalize – See: self-calibration
Offset – Offset is the difference between a nominal value (for an
artifact) or a target value (for a process) and the actual measured
value. For example, if the thermocouple alloy leads of a reference
junction probe are formed into a measurement junction and placed
in an ice point cell, and the reference junction itself is also in the
ice point, then the theoretical thermoelectric emf measured at the
copper wires should be zero. Any value other than zero is an offset
created by inhomogeneity of the thermocouple wires combined
with other uncertainties. Compare with: bias, error
On-site operations – Operations that are based in or directly supported
by an established calibration laboratory facility, but actually
perform the calibration actions at customer locations. This includes
climate-controlled mobile laboratories.
Performance Test – A performance test (or performance verification) is
the activity of verifying the performance of an item of measuring
and test equipment to provide assurance that the instrument is
capable of making correct measurements when it is properly used.
A performance test is done with the item in its normal operating
configuration. A performance test is the same as a calibration (1).
See also: calibration (1)
Policy – A policy defines and sets out the basic objectives, goals,
vision, or general management position on a specific topic. A
policy describes what management intends to have done regarding
a given portion of business activity. Policy statements relevant to
the quality management system are generally stated in the quality
manual. Policies can also be in the organization’s policy/procedure
manual. See also: procedure
Precision – Precision is a property of a measuring system or instrument.
Precision is a measure of the repeatability of a measuring system – how
much agreement there is within a group of repeated measurements
of the same quantity under the same conditions. (NCSL, page 26)
Precision is not the same as accuracy. (VIM, 3.5)
Preventive action – Preventive action is something done to prevent the
possible future occurrence of a nonconformance, even though such
an event has not yet happened. Preventive action helps improve the
system. Contrast with: corrective action
Procedure – A procedure describes a specific process for implementing
all or a portion of a policy. There may be more than one procedure
for a given policy. A procedure has more detail than a policy but
less detail than a work instruction. The level of detail needed
should correlate with the level of education and training of the
people with the usual qualifications to do the work and the amount
of judgment normally allowed to them by management. Some
policies may be implemented by fairly detailed procedures, while
others may only have a few general guidelines. Calibration: see
calibration procedure. See also: policy
Proficiency testing – Determination of laboratory testing performance
by means of interlaboratory comparisons.
Quality manual – The quality manual is the document that describes
the quality management policy of an organization with respect to a
specified conformance standard. The quality manual briefly defines
the general policies as they apply to the specified conformance
standard and affirms the commitment of the organization’s top
management to the policy. In addition to its regular use by the
organization, auditors use the quality manual when they audit
the quality management system. The quality manual is generally
provided to customers on request. Therefore, it does not usually
contain any detailed policies and never contains any procedures,
work instructions, or proprietary information.
Random error – Random error is the result of a single measurement of
a value, minus the mean of a large number of measurements of the
same value. (VIM, 3.13) Random error causes scatter in the results
of a sequence of readings and, therefore, is a measure of dispersion.
Random error is usually evaluated by Type A methods, but Type
B methods are also used in some situations. Note: Contrary to
popular belief, the GUM specifically does not replace random
error with either Type A or Type B methods of evaluation. See also:
error Compare with: systematic error
Repair – Repair is the process of returning an unserviceable or
nonconforming item to serviceable condition. The instrument is
opened, or has covers removed, or is removed from its case and
may be disassembled to some degree. Repair includes adjustment
or alignment of the item as well as component-level repair. (Some
minor adjustment such as zero and span may be included as part of
the calibration.) The need for repair may be indicated by the results
of a calibration. For calibratable items, repair is always followed by
calibration of the item. Passing the calibration test indicates success
of the repair. Contrast with: calibration (1), repair (minor)
Repair (minor) – Minor repair is the process of quickly and economically
returning an unserviceable item to serviceable condition by doing
simple work using parts that are in stock in the calibration lab.
Examples include replacement of batteries, fuses, or lamps; or
minor cleaning of switch contacts; or repairing a broken wire; or
replacing one or two in-stock components. The need for repair may
be indicated by the results of a calibration. For calibratable items,
minor repair is always followed by calibration of the item. Passing
the calibration test indicates success of the repair. Minor repairs are
defined as repairs that take no longer than a short time as defined
by laboratory management, and where no parts have to be ordered
from external suppliers, and where substantial disassembly of the
instrument is not required. Contrast with: calibration (1), repair
Reported value – One or more numerical results of a calibration
process, with the associated measurement uncertainty, as recorded
on a calibration report or certificate. The specific type and format
vary according to the type of measurement being made. In general,
most reported values will be in one of these formats:
•  Measurement result and uncertainty. The reported value is
usually the mean of a number of repeat measurements. The
uncertainty is usually expanded uncertainty as defined in the
GUM.
•  Deviation from the nominal (or reference) value and
uncertainty. The reported value is the difference between
the nominal value and the mean of a number of repeat
measurements. The uncertainty of the deviation is usually
expanded uncertainty as defined in the GUM.
•  Estimated systematic error and uncertainty. The value may be
reported this way when it is known that the instrument is part
of a measuring system and the systematic error will be used
to calculate a correction that will apply to the measurement
system results.
Round robin – See: Interlaboratory Comparison
Scope of accreditation – For an accredited calibration or testing
laboratory, the scope is a documented list of calibration or testing
fields, parameters, specific measurements, or calibrations and
their best measurement uncertainty. The scope document is an
attachment to the certificate of accreditation and the certificate is
incomplete without it. Only the calibration or testing areas that
the laboratory is accredited for are listed in the scope document,
and only the listed areas may be offered as accredited calibrations
or tests. The accreditation body usually defines the format and
other details.
Self-calibration – Self-calibration is a process performed by a user for
the purpose of making an IM&TE instrument or system ready
for use. The process may be required at intervals such as every
power-on sequence; or once per shift, day, or week of continuous
operation; or if the ambient temperature changes by a specified
amount. Once initiated, the process may be performed totally by the
instrument or may require user intervention and/or use of external
calibrated artifacts. The usual purpose is accuracy enhancement
by characterization of errors inherent in the measurement system
before the item to be measured is connected. Self-calibration is
not equivalent to periodic calibration (performance verification)
because it is not performed using a calibration procedure and does
not meet the metrological requirements for calibration. Also, if
an instrument requires self-calibration before use, then that will
also be accomplished at the start of a calibration procedure. Self-
calibration may also be called normalization or standardization.
Compare with: calibration (2.B) Contrast with: calibration (1)
Specification – In metrology, a specification is a documented statement
of the expected performance capabilities of a large group of
substantially identical measuring instruments, given in terms of
the relevant parameters and including the accuracy or uncertainty.
Customers use specifications to determine the suitability of a
product for their own applications. A product that performs
outside the specification limits when tested (calibrated) is rejected
for later adjustment, repair, or scrapping.
Standard (document) – A standard (industry, national, government, or
international standard; a norme) is a document that describes the
processes and methods that must be performed in order to achieve
a specific technical or management objective, or the methods for
evaluation of any of these. An example is ANSI/NCSL Z540-1-
1994, a national standard that describes the requirements for the
quality management system of a calibration organization and the
requirements for calibration and management of the measurement
standards used by the organization.
Standard (measurement) – A standard (measurement standard,
laboratory standard, calibration standard, reference standard; an
étalon) is a system, instrument, artifact, device, or material that
is used as a defined basis for making quantitative measurements.
The value and uncertainty of the standard define a limit to the
measurements that can be made: a laboratory can never have better
precision or accuracy than its standards. Measurement standards
are generally used in calibration laboratories. Items with similar
uses in a production shop are generally regarded as working-level
instruments by the calibration program.
Primary standard. Accepted as having the highest metrological
qualities and whose value is accepted without reference to other
standards of the same quantity. Examples: triple point of water cell
and caesium beam frequency standard. The highest-level standards,
found in national and international metrology laboratories, are the
realizations or representations of SI units.
Secondary standard. The highest accuracy level standard in a
particular laboratory, generally used only to calibrate working
standards. Also called a reference standard.
Working standard. A standard that is used for routine calibration
of IM&TE.
Transfer standard. A device used to transfer the value of a
measurement quantity (including the associated uncertainty) from
a higher-level standard to a lower-level standard.
See also: calibration standard
Standard operating procedure (SOP) – A term used by some organizations
to identify policies, procedures, or work instructions.
Standard reference material – A standard reference material (SRM) as
defined by NIST “is a material or artifact that has had one or more
of its property values certified by a technically valid procedure,
and is accompanied by, or traceable to, a certificate or other
documentation which is issued by NIST… Standard reference
materials are…manufactured according to strict specifications
and certified by NIST for one or more quantities of interest.
SRMs represent one of the primary vehicles for disseminating
measurement technology to industry.”
Standard uncertainty – The uncertainty of the result of a measurement,
expressed as a standard deviation. (GUM, 2.3.1)
Standardization – See: self-calibration.
Systematic error – A systematic error is the mean of a large number of
measurements of the same value minus the (probable) true value
of the measured parameter. (VIM, 3.14) Systematic error causes the
average of the readings to be offset from the true value. Systematic
error is a measure of magnitude and may be corrected. Systematic
error is also called bias when it applies to a measuring instrument.
Systematic error may be evaluated by Type A or Type B methods,
according to the type of data available. Note: Contrary to popular
belief, the GUM specifically does not replace systematic error with
either Type A or Type B methods of evaluation. (3.2.3, note) See
also: bias, error, correction (of error) Compare with: random error
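As a hypothetical illustration: if a large number of measurements of a 100.000 Ω standard resistor average 100.020 Ω, the systematic error is +0.020 Ω; the scatter of the individual readings about that average is the random error, and a correction of −0.020 Ω compensates for the systematic part.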
Test accuracy ratio – (1) In a calibration procedure, the test accuracy
ratio (TAR) is the ratio of the accuracy tolerance of the unit under
calibration to the accuracy tolerance of the calibration standard
used. (NCSL, page 2)

TAR = UUT tolerance / STD tolerance

The TAR must be calculated using identical parameters and units
for the UUC and the calibration standard. If the accuracy tolerances
are expressed as decibels, percentage, or another ratio, they must
be converted to absolute values of the basic measurement units.
(2) In the normal use of IM&TE items, the TAR is the ratio of
the tolerance of the parameter being measured to the accuracy
tolerance of the IM&TE. Note: TAR may also be referred to as the
accuracy ratio or (incorrectly) the uncertainty ratio.
Test uncertainty ratio – In a calibration procedure, the test uncertainty
ratio (TUR) is the ratio of the accuracy tolerance of the unit under
calibration to the uncertainty of the calibration standard used.
(NCSL, page 2)

TUR = UUT tolerance / STD uncertainty

The TUR must be calculated using identical parameters and units
for the UUC and the calibration standard. If the accuracy tolerances
are expressed as decibels, percentage, or another ratio, they must be
converted to absolute values of the basic measurement units. Note:
The uncertainty of a measurement standard is not necessarily the
same as its accuracy specification.
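The following minimal sketch (hypothetical figures, all expressed in the same unit; the variable names and values are illustrative only) contrasts the two ratios for one unit under calibration:

    # Contrast TAR and TUR for the same unit under calibration
    def ratio(uut_tolerance: float, reference: float) -> float:
        return uut_tolerance / reference

    uut_tol = 0.50    # accuracy tolerance of the unit under calibration
    std_tol = 0.125   # accuracy tolerance of the calibration standard
    std_unc = 0.10    # expanded uncertainty of the calibration standard

    print(f"TAR = {ratio(uut_tol, std_tol):.1f}:1")  # 4.0:1
    print(f"TUR = {ratio(uut_tol, std_unc):.1f}:1")  # 5.0:1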
Tolerance – A tolerance is a design feature that defines limits within
which a quality characteristic is supposed to be on individual parts;
it represents the maximum allowable deviation from a specified
value. Tolerances are applied during design and manufacturing. A
tolerance is a property of the item being measured. Compare with:
specification, uncertainty
Traceable, traceability – Traceability is a property of the result of a
measurement, providing the ability to relate the measurement result
to stated references, through an unbroken chain of comparisons
each having stated uncertainties. (VIM, 6.10) Traceability is a
demonstrated or implied property of the result of a measurement to
be consistent with an accepted standard within specified limits of
uncertainty. (NCSL, pages 42–43) The stated references are normally
the base or supplemental SI units as maintained by a national
metrology institute; fundamental or physical natural constants that
are reproducible and have defined values; ratio type comparisons;
certified standard reference materials; or industry or other accepted
consensus reference standards. Traceability provides the ability
to demonstrate the accuracy of a measurement result in terms of
the stated reference. Measurement assurance methods applied
to a calibration system include demonstration of traceability. A
calibration system operating under a program of controls only
implies traceability. Evidence of traceability includes the calibration
report (with values and uncertainty) of calibration standards, but
the report alone is not sufficient. The laboratory must also apply
and use the data. A calibration laboratory, a measurement system,
a calibrated IM&TE, a calibration report, or any other thing is
not and cannot be traceable to a national standard. Only the result of a
specific measurement can be said to be traceable, provided all of
the conditions just listed are met. Reference to a NIST test number
is specifically not evidence of traceability. That number is merely
a catalog number of the specific service provided by NIST to a
customer so it can be identified on a purchase order.
Transfer measurement – A transfer measurement is a method that
enables making a measurement at a higher level of resolution than
normally possible with the available equipment. Common transfer
methods are differential measurements and ratio measurements.
Transfer standard – A transfer standard is a measurement standard used
as an intermediate device when comparing two other standards.
(VIM, 6.8) Typical applications of transfer standards are to transfer
a measurement parameter from one organization to another, from
a primary standard to a secondary standard, or from a secondary
standard to a working standard in order to create or maintain
measurement traceability. Examples of typical transfer standards
are DC volt sources (standard cells or zener sources), and single-
value standard resistors, capacitors, or inductors.
Type A evaluation (of uncertainty) – Type A evaluation of measurement
uncertainty is the statistical analysis of actual measurement results
to produce uncertainty values. Both random and systematic error
may be evaluated by Type A methods. (GUM, 3.3.3 through
3.3.5) Uncertainty can only be evaluated by Type A methods if
the laboratory actually collects the data.
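As a minimal sketch of one common Type A evaluation (with hypothetical readings), the standard uncertainty of the mean of n repeated measurements is the sample standard deviation divided by √n:

    import math
    import statistics

    readings = [5.001, 5.003, 4.998, 5.002, 5.000, 4.999]  # hypothetical
    u_a = statistics.stdev(readings) / math.sqrt(len(readings))
    print(f"Type A standard uncertainty of the mean: {u_a:.5f}")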
Type B evaluation (of uncertainty) – Type B evaluation of measurement
uncertainty includes any method except statistical analysis of
actual measurement results. Both random and systematic error
may be evaluated by Type B methods. (GUM, 3.3.3 through 3.3.5)
Data for evaluation by Type B methods may come from any source
believed to be valid.
Uncertainty – Uncertainty is a property of a measurement result
that defines the range of probable values of the measurand.
Total uncertainty may consist of components that are evaluated
by the statistical probability distribution of experimental data
or from assumed probability distributions based on other data.
Uncertainty is an estimate of dispersion; effects that contribute
to the dispersion may be random or systematic. (GUM, 2.2.3)
Uncertainty is an estimate of the range of values that the true value
of the measurement is within, with a specified level of confidence.
After an item that has a specified tolerance has been calibrated
using an instrument with a known accuracy, the result is a value
with a calculated uncertainty. See also: Type A evaluation, Type B
evaluation
Uncertainty budget – The systematic description of known uncertainties
relevant to specific measurements or types of measurements,
categorized by type of measurement, range of measurement, and/
or other applicable measurement criteria.
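As an illustration, a simple uncertainty budget for a single measurement point might look like this (all components hypothetical, expressed as standard uncertainties in the same unit):

    Component                        Standard uncertainty
    Reference standard               0.100
    Resolution of the UUC            0.030
    Repeatability (Type A)           0.050
    Combined (root sum of squares)   0.116
    Expanded uncertainty (k = 2)     0.23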
UUC, UUT – The unit under calibration or the unit under test – the
instrument being calibrated. These are standard generic labels for
the IM&TE item that is being calibrated, which are used in the text
of the calibration procedure for convenience. Also may be called
device under test (DUT) or equipment under test (EUT).
Validation – Substantiation by examination and provision of objective
evidence that verified processes, methods, and/or procedures are
fit for their intended use.
Verification – Confirmation by examination and provision of objective
evidence that specified requirements have been fulfilled.
VIM – An acronym commonly used to identify the ISO International
Vocabulary of Basic and General Terms in Metrology. (The acronym
comes from the French title.)
Work Instruction – In a quality management system, a work instruction
defines the detailed steps necessary to carry out a procedure.
Work instructions are used only where they are needed to ensure
the quality of the product or service. The level of education and
training of the people with the usual qualifications to do the work
must be considered when writing a work instruction. In a metrology
laboratory, a calibration procedure is a type of work instruction.

1. Bucher, Jay L. 2004. The Metrology Handbook. Milwaukee: ASQ Quality Press.
About Beamex

•  One of the world's leading providers of calibration solutions.
•  Develops and manufactures high-quality calibration equipment, software, systems and services for the calibration and maintenance of process instruments.
•  Certified in accordance with the ISO 9001:2000 quality standard.
•  Comprehensive product range includes portable calibrators, workstations, calibration software, accessories, professional services and industry-specific solutions.
•  Products and services available in more than 60 countries. More than 10,000 companies worldwide utilize Beamex's calibration solutions.
•  Customers from a wide range of industries, such as automotive, aviation, contractor engineering, education, food and beverage, manufacturing, marine, metal and mining, nuclear, oil and gas, petrochemical and chemical, pharmaceutical, power and energy, and pulp and paper.
•  For customers with requirements for accuracy, versatility, efficiency, ease-of-use and reliability.
•  Beamex's Accredited Calibration Laboratory is accredited and approved by FINAS (Finnish Accreditation Service). FINAS is a member of all Multilateral Recognition Agreements / Mutual Recognition Arrangements (MLA/MRA) signed by European and other international organizations, i.e. European co-operation for Accreditation (EA), International Laboratory Accreditation Cooperation (ILAC) and International Accreditation Forum Inc. (IAF).

Why is Beamex better

Accuracy assured
Accuracy is assured when you decide to purchase a Beamex® calibrator. They are all delivered with a traceable, accredited calibration certificate.

Integrated calibration solutions
Beamex calibrators, workstations, calibration software and professional services form an integrated, automated system.

Industry pioneer with global presence
A forerunner in developing high-quality calibration equipment and software, with a global customer base and partner network.

High customer satisfaction
Constantly improving understanding of customer needs and developing solutions to meet them.

Support
Installation, training, validation, system integration, database conversion, Help Desk and re-calibration services available.
www.beamex.com