
FOUNDATION PRINCIPLES

CALIBRATION AND TEST EQUIPMENT


TOPIC 4 CONTENT

Certificate IV in Engineering (Instrumentation)


MEM40105 (WT46)

© Challenger Institute of Technology


TOPIC 4 - CALIBRATION AND TEST EQUIPMENT
TABLE OF CONTENTS

1 INTRODUCTION
2 MEASUREMENTS
   2.1 Read Internet Document
   2.2 Measurement Introduction
   2.3 Test Equipment Certification Against Standards
   2.4 National Measurement Institute (NMI)
   2.5 National Association of Testing Authorities (NATA)
   2.6 Accuracy and Repeatability
   2.7 Range Limits and Span
   2.8 Errors
   2.9 Calibration Zero Suppression and Zero Elevation
   2.10 Display Suppressed Zero and Elevated Zero
3 INSTRUMENT SPECIFICATIONS
   3.1 Instrument Specification Introduction
   3.2 Errors
   3.3 Instrument Error
      3.3.1.1 Fault Finding Comment
   3.4 Negative Value of LRL
   3.5 Instrument Pressure Limits
   3.6 Instrument Temperature Limits and Fill Fluid Selection
   3.7 Read Internet Document
      3.7.1.1 Fault Finding Comment
   3.8 Material Selection and Corrosion
   3.9 Read Internet Document
      3.9.1.1 Fault Finding Comment
4 INPUTS AND OUTPUTS
   4.1 Inputs and Outputs Introduction
   4.2 Field Instrument Inputs
   4.3 Transmitter Outputs
   4.4 Why a 4-20 mA Signal?
      4.4.1 4-20 mA
      4.4.2 HART
      4.4.3 Foundation Fieldbus
      4.4.4 Wireless

      4.4.5 Contacts
   4.5 Transmitter Power Supply
      4.5.1 24 Vdc Loop Powered 2 Wire
      4.5.2 24 Vdc Loop Powered 4 Wire
      4.5.3 240 Vac or 120 Vac
      4.5.4 Battery
5 TEST EQUIPMENT
   5.1 Read Internet Document
   5.2 Watch Internet Videos
   5.3 Test Equipment Introduction
   5.4 HART and Fieldbus Communicator
   5.5 Pressure Source
   5.6 Temperature Source
   5.7 Level Source
   5.8 Flow Source
   5.9 Gas Detectors and Analysers
6 CALIBRATION
   6.1 Read Internet Document
   6.2 Calibration Introduction
   6.3 Reasons for Performing a Calibration
   6.4 Calibration Steps
      6.4.1 Step 1 Get Required Calibration Range
      6.4.2 Step 2 Select Correct Test Equipment
      6.4.3 Step 3 Determine Best Location for Calibration
      6.4.4 Step 4 Follow the Manufacturer's Manual to Calibrate
      6.4.5 Step 5 Verify the Calibration
      6.4.6 Step 6 Document the Calibration

1 INTRODUCTION
The purpose of Topic 4, Calibration and Test Equipment, is to create a foundation of
knowledge on which the subsequent courses can build to reach the level of knowledge
required to successfully complete the Certificate IV in Engineering
(Instrumentation).
This topic starts with the principles of measurement. If you understand the principles
of measurement, it will assist in understanding the benefits and limitations of
calibration. It is common for an instrument technician to have to select a replacement
transmitter, either from the site warehouse or by ordering a new one. In the instrument
specification section, all published errors of a typical instrument are calculated. The
technician is advised that the installation can affect how the calibration should be
performed. The best method to ensure stable operation is in the original instrument
selection. To this end, transmitter range selection, fill fluids and material corrosion
are introduced.
Calibration is the function of adjusting the instrument output so that it represents the
required process variable input. The instrument technician is exposed to the various
types of instrument outputs and power supplies. Test equipment and common steps
in calibration are then discussed.
This topic uses a typical Rosemount pressure transmitter to discuss the principles,
terms, and key points that apply to all instruments. Challenger Institute of Technology
is not advising that preference should be given to Rosemount or that Rosemount is a
superior product to other manufacturers. Rosemount was simply selected because it
has extensive reference material on its web site and Rosemount is available
internationally.
Within this topic, we will be referencing several Rosemount documents and
applications. These documents/applications are available free of charge on the
Emerson Rosemount web site. The full website path is provided within the reading
table section.
http://www.emersonprocess.com/rosemount/document/index.html
Rosemount Documents Used:
 Rosemount 3051S Product Data Sheet 00813-0100-4801 Rev KA April 2008.
When reading the Rosemount data sheet, please take note that the primary
units are inH2O and bar is used in the metric conversion, but a comma
rather than a period is used to represent the decimal point.
 Rosemount 1199 Fill Fluid Specifications 00804-2100-4016 Rev AA
 Rosemount Technical Data Sheet Corrosion and Its Effects 00816-0100-3045
Rev CA May 2003
 Rosemount 375 Field Communicator Users Manual March 2007
 Rosemount 3051S HART Reference Manual 00809-0100-4801 Rev CB
January 2007. There are separate manuals for fieldbus and wireless outputs.

Rosemount Video Applications Files Used:
 Rosemount 375 Basic Setup HART
 Rosemount 375 Basic Setup Fieldbus
 Rosemount 375 Zero Trim HART
 Rosemount 375 Segment Health Fieldbus
 Rosemount 375 Configure Temp Sensor
 Rosemount 375 Basic Level
 Rosemount 375 Tank Geometry
 Rosemount 375 Echo Tuning

Good luck on this topic. If you have any questions, please contact your assigned
lecturer.

2 MEASUREMENTS
2.1 Read Internet Document

Read Document   Product Data Sheet 00813-0100-4801 Rev KA April 2008

Company         Rosemount

Website         http://www.emersonprocess.com/rosemount/document/pds/4801b00n.pdf

Pages           1-19 and 31-35

Comment         Note that the primary units are inH2O and bar is used in the metric
                conversion, but a comma rather than a period is used to represent the
                decimal point.

2.2 Measurement Introduction

Any calibration you perform is only as good as the calibration and test equipment you
used to perform the calibration. If the test equipment has an error, then the calibrated
instrument will have the same error as the test equipment.

2.3 Test Equipment Certification Against Standards

Test equipment needs to be tested occasionally against a known standard to ensure
it is operating to specifications and within its stated accuracy. In Australia, the National
Measurement Institute (NMI) holds the highest standards.
The test equipment used as a measurement standard (also called a working standard) at
your plant should be sent to a third-party accredited test lab for certification as per
your plant procedures. Certified test equipment should be treated with greater care
than general test equipment.
Not all test equipment at your instrument shop will be sent out for certification. Some
calibration equipment will be tested at your plant against your certified test
equipment. Some instruments when purchased will come with a certification of
calibration from the manufacturer.
If instruments are installed at your plant by a third party contractor, it is typically a
contract requirement that any calibrations performed by the contractor come
complete with calibration certificates. The calibration certificate will state the test
equipment used. In addition, the test equipment would require a certificate stating
when it was last calibrated and against which traceable standard. It is not uncommon to
require certification on some test equipment every six months.

2.4 National Measurement Institute (NMI)

The National Measurement Institute (NMI) is part of the Australian Government
Department of Innovation, Industry, Science and Research. The NMI provides
calibration services traceable to SI units of measurement for approximately $250
per hour. The NMI standards are the highest accuracy measurements available in
Australia. The NMI calibration services related to instrumentation are load cells,
manometers, pressure gauges, vacuum gauges, pressure transducers, positive
displacement meters, resistance thermometers, thermocouples, density, and
hygrometry. The NMI web site is http://www.measurement.gov.au.

2.5 National Association of Testing Authorities (NATA)

The National Association of Testing Authorities (NATA) is a not-for-profit national
laboratory accreditation authority approved by the Australian Government to provide
accreditation for laboratories and testing facilities. NATA accreditation means the
test facility is competent in testing, measurement, inspection and calibration as
stated within its accreditation. NATA also provides accreditation for producers
of certified reference materials, training, information, and support services. The
NATA web site is http://www.nata.asn.au

2.6 Accuracy and Repeatability

A measurement is only as good as the design limitations of the instrument. An
instrument with an accuracy of 10% and a repeatability of 1% will most likely
repeatedly give a consistent indication of the wrong process value. This is an
oversimplification, but a good starting point to introduce the concepts. In instrumentation,
we use the term repeatability rather than precision.

Example: Tank Level / Target Shooting

Accuracy
Accuracy means correct. A measurement is accurate if the transmitter output
represents the actual process variable. If the tank level was 45%, an accurate level
transmitter would measure 45%.

Precision
Precision means having repeatability in reproducing the same results. A level
transmitter that consistently states the tank level is 50% when it is actually 45% has
precision but not accuracy.

Accuracy with Precision
Calibration can improve accuracy. The level transmitter accuracy should be improved
by calibration. This is similar to adjusting your scope in target shooting, which can
make a precision shooter more accurate.

No Accuracy and No Precision (Random - Bad Shooter)
If the level transmitter has neither accuracy nor repeatability, then it should just be
replaced. A target shooter with neither accuracy nor precision may improve with
practice, but a bad transmitter never will.

2.7 Range Limits and Span

Each instrument will have specified range limits that the instrument is designed to
operate within. The upper range limit (URL) is the highest quantity that a device can
be adjusted to measure and the lower range limit (LRL) is the lowest quantity that a
device can be adjusted to measure. Range is the value between the LRL and URL
that the instrument is capable of measuring. The product data sheet will usually state
errors as a percentage of URL, span, or reading.
Span is the algebraic difference between the upper and lower range values. For
example: Range = -40C to 200C, Span = 240C; Range = 4-20mA, Span = 16mA.
It is important that you understand how to read and apply information in a
manufacturer product data sheet to select the correct instrument. Using the
Rosemount 3051S Product Data Sheet, let's look at what range should be selected
for a differential pressure transmitter with a calibration of -70 to 50 kPa. The
calibrated span would be 120 kPa. When selecting an instrument range you may not
exceed either the LRL or the URL, or have a calibrated span less than the minimum
instrument span.
For Example:

Specification              Range 2      Range 3      Need       Comments

LRL                        -62 kPa      -249 kPa     -70 kPa    Required LRL is beyond Range 2 LRL; Range 3 OK

URL                        62 kPa       249 kPa      50 kPa     Either Range 2 or 3 OK

Instrument Minimum Span    0.623 kPa    2.49 kPa     120 kPa    Either Range 2 or 3 OK

The correct selection would be a Range 3 transmitter, since a Range 2 transmitter
does not have an LRL within the required calibration range of the instrument.
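To make the range selection logic above concrete, here is a short Python sketch. It is
an illustrative example only; the Range 2 and Range 3 limits are the kPa values quoted
above and the function names are made up for the example.

    # Illustrative sketch: check whether a required calibration fits a transmitter range.
    # The Range 2 and Range 3 limits below are the kPa values quoted in the text above.

    def range_suitable(lrl, url, min_span, cal_low, cal_high):
        """Return True if the calibration fits within the range limits."""
        span = cal_high - cal_low
        return lrl <= cal_low and cal_high <= url and span >= min_span

    ranges = {"Range 2": (-62.0, 62.0, 0.623), "Range 3": (-249.0, 249.0, 2.49)}
    cal_low, cal_high = -70.0, 50.0   # required calibration of -70 to 50 kPa

    for name, (lrl, url, min_span) in ranges.items():
        ok = range_suitable(lrl, url, min_span, cal_low, cal_high)
        print(f"{name}: {'OK' if ok else 'not suitable'}")
    # Expected: Range 2 not suitable (LRL only -62 kPa), Range 3 OK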

2.8 Errors

In statistics, errors are additive. The sum of all published errors is the error that is
stated as either +-% or +- units relative to the true value and does not depend on the true
value itself. In other words, the statistical result of the measurement is considered to be
the true value +- the calculated error. In the Instrument Specification section, the actual
errors of a common pressure transmitter will be examined in detail.

2.9 Calibration Zero Suppression and Zero Elevation

Zero suppression is when a pressure transmitter is mounted below the high side
process connection and the pressure in the impulse tubing, wet leg, or remote seal
causes a pressure to be measured by the transmitter even though the tank or
process line is empty. This pressure is calibrated out so the output of the transmitter
represents the true process variable.
Zero elevation is when a pressure transmitter is mounted above the high side
process connection or when there are two remote seals. The transmitter sees
negative pressure even though the tank is empty. The transmitter is sensing the head
pressure of the capillary fill fluid. This pressure is calibrated out so the output of the
transmitter represents the true process variable.
In the next course, Transmitters & Valves, we will be discussing remote seals,
elevation and suppression in detail. For now, the intent is to make you aware of why a
calibration might be -6.4 to 13.6 kPa or 7 to 27 kPa, and not a standard 0-20 kPa, for a
tank with a 0-100% range on the graphic display.
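As a simple illustration of where such calibration values come from, the Python sketch
below shifts a 0-20 kPa tank span by the head pressure of a wet leg or capillary fill
fluid. The 20 kPa span and the head offsets are example figures chosen to match the
values mentioned above, not figures from any data sheet.

    # Illustrative sketch: shift a tank measurement span by a constant head pressure
    # so the calibrated range matches what the transmitter actually sees.

    def calibrated_range(span_kpa, head_offset_kpa):
        """Return (lower, upper) calibration values after applying a zero shift."""
        return head_offset_kpa, head_offset_kpa + span_kpa

    print(calibrated_range(20.0, 7.0))    # wet leg adds +7 kPa: (7.0, 27.0), zero suppression
    print(calibrated_range(20.0, -6.4))   # capillary fill gives -6.4 kPa: (-6.4, 13.6), zero elevation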

2.10 Display Suppressed Zero and Elevated Zero

It is important not to get pressure calibration zero suppression and zero elevation
confused with a display suppressed zero and elevated zero. A field mounted gauge
or other type of display is said to have a suppressed zero when the lower range of the
displayed variable is greater than zero, for example 30 to 45C. An elevated
zero is when the display lower range value is less than zero, for example -40 to 100C.
A fever thermometer might be 150mm long and have a display of
30 to 45C. The reason for having a limited display range is so that you can read the
thermometer with greater resolution, due to the marked scale covering a limited
span of 15C. There may be only one degree C per scale marking. If the same
thermometer had a display of -40 to 100C, you most likely would not be able to
distinguish on the scale whether the temperature was 38 or 39.
Increased resolution does not necessarily mean increased accuracy of the primary
sensing element. It does mean it is easier to read the display and distinguish
between display values.

3 INSTRUMENT SPECIFICATIONS
3.1 Instrument Specification Introduction

A manufacturer product data sheet in the specification section will usually state
specifications for accuracy and repeatability or a combined error total value.
Sometimes each error specification is stated separately. There will be measurement
errors from instrument accuracy, hysteresis, repeatability, temperature effect,
process pressure effect, mounting effect, indicator, etc. The instrument range and
minimum span are usually stated along with the instrument design pressure and
temperature limits. The output, materials of construction, and electrical ratings will
also be stated on the manufacturer product data sheet.

3.2 Errors

To understand calibration you will need to understand the limitations of the instrument
you are going to calibrate. It is important for an instrument technician to know how to
read a manufacturer product data sheet. It is not acceptable to blindly calibrate with
the information stated in the instrument data sheet. You should stop and think: does
this make sense? An error may have been made at the original engineering stage, or
the instrument may have been changed out with another that has different
specifications than the original design. It is important that you understand that the
instrument selected will impact on the total error of the measurement. Remember,
errors come in many forms: human calculation error in design, instrument accuracy,
instrument selection error, errors caused by the actual application, drift as an
instrument ages, and programming errors.

3.3 Instrument Error

The best way to understand the errors that an instrument may have is by looking at and
discussing an actual manufacturer's data sheet. For this example, we are using
the Rosemount 3051S Product Data Sheet. When reading the Rosemount data sheet,
please take note that the primary units are inH2O and bar is used in the metric
conversion, but a comma rather than a period is used to represent the decimal point.
The example below uses the following model number.
Description: Rosemount Pressure Transmitter, Classic, Coplanar Connection,
Differential Measurement, Pressure Range 3 (-1000 to 1000 inH2O) = (-2.49 to 2.49
bar) = (-249 to 249 kPa), Minimum Span 2.49 kPa, with 316L SST Isolating
Diaphragm. The model number would be 3051S2CD3A2.
On page 4, the term rangedown is used. Rangedown is the ratio of
(URL - LRL) / Minimum Span. For example, a range 3 Classic has a URL of
1000 inH2O, an LRL of -1000 inH2O, and a minimum span of 10 inH2O. The rangedown
is (1000 - (-1000))/10 = 2000/10 = 200, expressed as 200:1.
On page 5, under reference accuracy, there are different accuracy calculations
depending on the span ratio. This means (URL - LRL) / Calibrated Span. The
instrument span is 498 kPa. For the examples below, we are using calibrated spans
of 20 kPa and 200 kPa: the 20 kPa span ratio is 498/20 = 24.9:1 and the 200 kPa
span ratio is 498/200 = 2.49:1.
On page 6 is the Total Performance, which is the combined error for reference
accuracy, ambient temperature, and line pressure effect. It has not been included below
since it would duplicate the individual values and it is not inclusive of all published
errors.
Statistically, errors are cumulative. It does not mean a measured value will have the
calculated error, only that the instrument could have the calculated error and still be
within specifications. The calculations below are made using the calibrated range in
place of span. The reason for this is that it is the method stated for digital outputs, and
the error values are lower when using the calibration range rather than the instrument span.
To assist in the discussion after this example, the calculations were also performed
for a Range 2 transmitter. The results are stated in brackets in each 20 kPa
calculation where the value differs from the Range 3 calculation.
The calculations are set out below.

Error calculations for the 3051S2CD3A2 Range 3 transmitter, for calibrated spans of
20 kPa and 200 kPa (Range 2 values are shown in brackets where they differ):

Reference Accuracy - data sheet page 5 (includes terminal based linearity, hysteresis and repeatability)
   Stated value:   +-0.055% of span; for spans <10:1, +-(0.015 + 0.005(URL/span))% of span
   20 kPa span:    = (0.055/100)(20) = +-0.011 kPa   (Range 2: +-0.0061 kPa)
   200 kPa span:   = ((0.015 + 0.005(249/200))/100)(200) = +-0.04245 kPa

Long Term Stability - page 6
   Stated value:   +-0.125% of URL for 5 years, for a +-28C temperature change and 6900 kPa line pressure
   20 kPa span:    = (0.125/100)(249) = +-0.31125 kPa   (Range 2: +-0.0775 kPa)
   200 kPa span:   = (0.125/100)(249) = +-0.31125 kPa

Ambient Temperature Effect - page 7
   Stated value:   +-(0.0125% URL + 0.0625% span) per 28C for span ratios from 1:1 to 5:1;
                   +-(0.025% URL + 0.125% span) per 28C for span ratios from >5:1 to 100:1
   20 kPa span:    = (0.025/100)(249) + (0.125/100)(20) = +-0.08725 kPa per 28C   (Range 2: +-0.02025 kPa)
   200 kPa span:   = (0.0125/100)(249) + (0.0625/100)(200) = +-0.156125 kPa per 28C

Line Pressure Effect, Zero Error - page 8 (can be calibrated out)
   Stated value:   +-0.05% of URL per 6900 kPa
   20 kPa span:    = (0.05/100)(249) = +-0.1245 kPa   (Range 2: +-0.031 kPa)
   200 kPa span:   = (0.05/100)(249) = +-0.1245 kPa

Line Pressure Effect, Span Error - page 8
   Stated value:   +-0.1% of reading per 6900 kPa
   20 kPa span:    = (0.1/100)(20) = +-0.02 kPa at full scale
   200 kPa span:   = (0.1/100)(200) = +-0.2 kPa at full scale

Mounting Position Effect - page 8
   Stated value:   zero shift of +-0.311 kPa, can be calibrated out
   20 kPa span:    = +-0.311 kPa
   200 kPa span:   = +-0.311 kPa

Vibration Effect - page 8
   Stated value:   +-0.1% of URL
   20 kPa span:    = (0.1/100)(249) = +-0.249 kPa   (Range 2: +-0.062 kPa)
   200 kPa span:   = (0.1/100)(249) = +-0.249 kPa

Power Supply Effect - page 8
   Stated value:   +-0.005% of calibrated span per volt
   20 kPa span:    = (0.005/100)(20) = +-0.001 kPa per volt
   200 kPa span:   = (0.005/100)(200) = +-0.01 kPa per volt

Total
   20 kPa span:    +-1.115 kPa or 5.575%   (Range 2: +-0.523 kPa or 2.615%)
   200 kPa span:   +-1.404 kPa or 0.702%
What does this error calculation mean? For the transmitter at full scale reading,
with temperature variation less than 28C, line pressure less than 6900 kPa, and a 24
volt power supply with less than 1 volt variation, the total statistical error for the
transmitter calibrated at 20 kPa is +-5.6% and at 200 kPa is +-0.7%. This does not
mean the actual error will be these values, just that the sum of all the instrument's
published errors is equal to this value.
Why should you care? The Range 3 transmitter can be calibrated for spans from
2.49 kPa to 498 kPa if there is a suppressed zero. The Range 2 transmitter can be
calibrated for spans from 0.623 kPa to 124 kPa if there is a suppressed zero. It is
good practice to select a transmitter where your calibrated range is near the URL
rather than the minimum span. For the calibrated span of 20 kPa, using a Range 2
transmitter would statistically give a better reading, with +-2.615% rather than
+-5.575% with a Range 3.
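As a rough cross-check of the totals above, the short Python sketch below sums the
published errors for the 20 kPa and 200 kPa calibrations. It is illustrative only; the
error figures are simply the kPa values calculated in the table above.

    # Illustrative sketch: worst-case sum of the published errors from the table above,
    # expressed in kPa and as a percentage of the calibrated span.

    def total_error(errors_kpa, span_kpa):
        """Worst-case (additive) error in kPa and as % of calibrated span."""
        total = sum(errors_kpa)
        return total, 100.0 * total / span_kpa

    # Range 3 transmitter, 20 kPa calibrated span (values from the table above)
    errors_20 = [0.011, 0.31125, 0.08725, 0.1245, 0.02, 0.311, 0.249, 0.001]
    # Range 3 transmitter, 200 kPa calibrated span
    errors_200 = [0.04245, 0.31125, 0.156125, 0.1245, 0.2, 0.311, 0.249, 0.01]

    print(total_error(errors_20, 20.0))    # approximately (1.115, 5.575)
    print(total_error(errors_200, 200.0))  # approximately (1.404, 0.702)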

3.3.1.1 Fault Finding Comment

 Select a transmitter where the calibrated range is near the URL rather than the
minimum span. This applies to all transmitters, not just pressure transmitters.
 If you are getting calibration drift between summer and winter, look at the ambient
temperature effect of the installation.
 For high line pressures, remember to calibrate out the zero error effect.
 Remember to calibrate the transmitter in the same position that it will be mounted
in the field.
 Mount the transmitter remotely if the line/equipment has vibration problems.
 If using a field mounted 24VDC power supply, ensure the specified power supply
has minimal voltage drift.
 Know how to read a product data sheet, understand the basic principles of pressure,
flow and temperature, and be logical when something does not look right, and you will
have the care required to be a successful instrument technician.
 Be careful when a data sheet has both imperial and metric units. Rosemount uses bar
for the metric conversion, but a comma rather than a period is used to represent the
decimal point. If in doubt, convert the imperial units yourself to determine the
correct metric value (see the short conversion sketch after this list). It may appear
obvious to double check the units if in doubt, but common sense does not always rule.
Logically, we are programmed to accept whatever the manufacturer publishes.
 Actual Case: In the early 1990's, when Rosemount started publishing both
imperial and metric units, they made a mistake on the metric conversions of the
pressure transmitter data sheet and all metric values were out by one digit. You
might not think this would make much of a difference, but the plant I was at
ordered a lot of transmitters based on that data sheet. Needless to say, the
transmitters were useless for the required application.
 Actual Case: I was at a plant without a pressure transmitter on a stack. There
was a process upset and chemicals were discharged to atmosphere resulting in
the employee‟s cars being coated with a chemical that ate the paint. The
company had to pay to repaint numerous cars. This happened more than once
and the government got involved. The hasty resolve was to put a pressure
transmitter on the stack interlocked to shut-down process if the situation arose
again. The chemical would eat through 316SS in about 6 weeks and we could not
get a transmitter with the correct sensor material for 8 weeks. The pressure
reading was low range and we had the option of using an absolute pressure
transmitter, a draft range transmitter, or possibly a range 1 or 2 if available in the
warehouse. The question put through to the instrument mechanics and myself
was what can we do NOW? The resolution was to use a transmitter from site
warehouse with a 316SS capsule and change out if it was destroyed while
waiting for correct materials. The next question was what range of transmitter would
give the most accurate reading: a transmitter with a higher accuracy specification
used near its minimum span, or a less accurate transmitter used near its URL. By
understanding the limitations of a transmitter and being able to calculate the actual
accuracy, the best selection was able to be made.

 Actual Case: At another job, the instrument engineer just ordered all level
transmitters with the same range. Unfortunately, the range he selected had a
minimum span greater than the calibrated span actually required for every transmitter.
When the plant instrument technician went to calibrate, the problem was found. It
should not happen, but it does.
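Since data sheets mix imperial and metric units, a quick self-check conversion is worth
doing. The Python sketch below is a minimal example; the conversion factor of
1 inH2O = 0.249 kPa is the approximate figure implied by the 1000 inH2O = 249 kPa
values used in this topic, not a precise standards value.

    # Illustrative sketch: convert data sheet pressure values between inH2O, kPa, and bar
    # to double-check published metric conversions. The factor is approximate (1 inH2O is
    # taken as 0.249 kPa, matching the 1000 inH2O = 249 kPa figures used above).

    KPA_PER_INH2O = 0.249
    KPA_PER_BAR = 100.0

    def inh2o_to_kpa(inh2o):
        return inh2o * KPA_PER_INH2O

    def kpa_to_bar(kpa):
        return kpa / KPA_PER_BAR

    url_inh2o = 1000.0                      # Range 3 URL from the data sheet
    url_kpa = inh2o_to_kpa(url_inh2o)       # 249.0 kPa
    print(url_kpa, kpa_to_bar(url_kpa))     # 249.0 kPa, 2.49 bar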

3.4 Negative Value of LRL

Look at page 9 of the Rosemount 3051S Product Data Sheet. The instrument range
is listed under Functional Specifications. You will note that the Lower Range Limit
(LRL) is negative except for the absolute transmitter, which is 0 bar. The eighth
character in the model number states what type of pressure transmitter it is. A
differential pressure transmitter has code 'D', gauge code 'G', and absolute code 'A'.
The differential pressure transmitters (DP Xmitter) have the same value for URL as
for LRL. The gauge transmitters have an LRL equal to or smaller in magnitude than
the URL; the main difference is that the LRL has a negative sign. With a DP Xmitter, the
sensor capsule is the same on both sides. To measure vacuum, a DP Xmitter is
sometimes used with the process connected to the low port. The reason for using the
abbreviation DP Xmitter here is for you to become familiar with the common ways of
writing in instrumentation.
The reason the LRL is required to be negative is that sometimes the transmitter has
remote seals or a wet leg, and the pressure created by the remote seal fluid must be
calibrated out so only the actual process pressure is measured. For example,
suppose a pressure transmitter with remote seals on a tank requires a calibration
where 0% tank level is -72 kPa and 100% tank level is 20 kPa. The calibration would be -72
to 20 kPa. The span would be 92 kPa. To select the correct range transmitter, the
LRL must be at or below -72 kPa. Although a Range 2 transmitter has a span of 124 kPa, a
Range 2 would not work since its LRL is only -62 kPa and we need -72 kPa.
Therefore, a Range 3 transmitter must be used for this application.

3.5 Instrument Pressure Limits

Generally, the design pressure or overpressure limit will be at least 1.5 times the
Upper Range Limit (URL). If the instrument has flanges, then the instrument rating
will be at least equal to the ANSI pipe class or the DIN PN pressure rating. The
Rosemount 3051S Product Data Sheet, pages 11 and 12, states the design pressure
of the transmitter.

3.6 Instrument Temperature Limits and Fill Fluid Selection

The instrument process temperature limits need to be confirmed if replacing an
existing instrument or specifying a new instrument. For a pressure transmitter, it is
usually the type of sensor capsule fill fluid and the type of process connection that
determine the temperature limits. The process temperature and type of process fluid
need to be looked at to determine what fill fluid should be used. For example, oxygen,
halogens, chlorine and other chemicals require an inert fill to be used, as silicone fill
can cause an explosion. If you are changing out a transmitter, then be careful that the
replacement transmitter fill fluid is rated for your application.
Most applications can use the standard silicone fill fluid, but for very hot or cryogenic
service you need to check the type and rating of the transmitter fill fluid used. The
Rosemount 3051S Product Data Sheet on page 12 lists temperature ratings. The
reading below will assist in understanding the various types of fill fluid, applications
and temperature pressure charts.

3.7 Read Internet Document

Read Document   Rosemount 1199 Fill Fluid Specifications 00804-2100-4016 Rev AA

Company         Rosemount

Website         http://www.emersonprocess.com/rosemount/document/notes/00840-2100-4016.pdf

Pages           1-9

Comment         Used in the assignment section to answer questions.

3.7.1.1 Fault Finding Comment

 Do not just blindly use one company standard fill fluid at another plant.
 Actual Case: An instrument engineer specified all transmitters for a low cost pulp
mill in South America with normal design conditions to have a special fill fluid in all
sensors. The reason was that when he worked for an oil company, that was the
company standard. The problems were that the application was different, it would be
difficult to source that fill fluid in South America, the fill fluid was rated for cold
climates and it was always warm there, and he did not know that chlorine would
explode if it came in contact with the fill fluid. All around, this was a bad
application. Just because you are familiar with one application does not mean it is
right for another. Ultimately, the guy was fired. He just would not look beyond his
limited window of experience.

3.8 Material Selection and Corrosion

The Rosemount 3051S Product Data Sheet on page 31 states various isolating
diaphragm materials and flange materials. A pressure transmitter diaphragm, flange
and vent will be in contact with the process fluid. Parts in contact with the process
fluid are referred to as wetted parts. It is important that an instrument technician
understands that any material in contact with the process fluid must be compatible
with that fluid. Many instrument companies will provide corrosion tables for the
common materials and applications they sell to. Depending on the size of your plant,
it is common for the company to have a document or chart of material compatibilities
for the process fluid at your plant.
To be a successful instrument technician you must understand corrosion and its
effects.

3.9 Read Internet Document

Read Document   Technical Data Sheet Corrosion and Its Effects 00816-0100-3045 Rev CA May 2003

Company         Rosemount

Website         http://www.emersonprocess.com/rosemount/document/pds/corrode.pdf

Pages           1-15

Comment         Used in the assignment section to answer questions.

3.9.1.1 Fault Finding Comment

 If you have to specify a replacement instrument and do not have access to
corrosion tables, one way to select the correct materials is to look at the existing
instrument data sheet, or read the model number from the instrument and look at
the manufacturer data sheet to determine the materials.
 If you need to specify a new instrument, look at an instrument in the same or
similar service and see what materials are stated on that instrument data sheet.
 If the instrument has a history of failure or the material in contact with the process
fluid looks pitted, then discuss with your supervisor what might be a better option
in material selection for the replacement instrument.
 If looking at other instruments to help determine material compatibility, be careful
that you are considering the thickness of the material in contact with the process
fluid. For example, the body of a valve is much thicker than a pressure transmitter
diaphragm. The valve might just pit and last for years; the diaphragm, being very
thin, might be eaten out in weeks.

4 INPUTS AND OUTPUTS
4.1 Inputs and Outputs Introduction

In this section, the various types of inputs into a transmitter, common transmitter
outputs, and common power supply requirements are discussed.

4.2 Field Instrument Inputs

The input into a field instrument will depend on what type of measurement is being
made. In the case of pressure, temperature, flow, analytical, and density, the
instrument is in contact with the process fluid. Level measurements may make
contact with the process fluid or be mounted above the vessel. Vibration, weight
scale, speed, and limit switches are mounted on the equipment and do not make
contact with the process fluid. Current and power are measured in the switchboard. Gas
detectors, ambient air temperature, humidity, and local panels are mounted in the
building/room. This list is not inclusive of all instruments in a plant but does provide a
general guide as to the various types of inputs into sensors/transmitters.
All instruments will have some type of sensor either remote mounted or within the
instrument. In the case of the pressure transmitter discussed earlier, the sensor is in
the diaphragm seal contained within the transmitter.
For a remote mounted separate sensor, the sensor is hard wired to the transmitter.
Without both the sensor and the transmitter functioning, there will not be a valid
transmitter output. Common separate sensors are required for temperature, pH, and
some magmeters.
A temperature transmitter usually has a separate RTD or thermocouple sensor
mounted in a thermowell. The thermowell makes contact with the process fluid, and
the temperature sensor can be removed without causing a process disturbance. The
transmitter, which converts the sensor signal to a standardised output signal, will usually be
remote mounted for hot or inaccessible locations. In accessible locations, it is
common to have the temperature sensor mounted on the thermowell and the
transmitter mounted on the sensor body. The complete unit is called a temperature
assembly. Typical instrument tags may be TW for the thermowell, TE for the sensor,
and TT for the transmitter.
In Transmitters & Valves, transmitters will be discussed in detail.

4.3 Transmitter Outputs

The input into a transmitter is what is being measured. The output from a transmitter
will be a signal that is compatible with the control system. In the pre-1950s era, most
transmitter outputs were a pneumatic 3-15 psi signal. Australia, the UK, New Zealand,
Canada and many other countries had not converted to metric at that time. If you
work in an old plant, there may still be some pneumatic transmitters in operation.
After pneumatics came electronics, and the 4-20 mA output signal came to dominate the market.
With modern control systems, the transmitter output may be 4-20 mA,
4-20 mA with a digital signal on HART protocol, Foundation fieldbus protocol, or
wireless. See page 32 of the Rosemount 3051S Product Data Sheet, for an example
of how a transmitter output is specified. The Rosemount transmitter model 1151 also
offers 10-50 mA, 0.8-3.2 Vdc, and 1-5 Vdc outputs.
Although there are various types of outputs available from transmitters, in this course,
when introducing basic principles, we will use the 4-20 mA signal for explanation,
since it is easier to describe and understand. If you can understand the principles,
then understanding instruments with different outputs will be easier. In the later
courses, other output types will be discussed in detail.

4.4 Why a 4-20 mA Signal?

The reason the output signal is 4-20 mA is that when there is a fault in the
instrument, signal line, or rack, the signal should drop out to 0 mA or some value less
than 4 mA. Most control systems are programmed so that if the signal is less than 3.5
mA, an instrument fault alarm is raised and an instrument technician will be called
to correct the problem. If the standard signal was 0-16 mA, then there would be no
way of determining a problem in the instrument, signal line, or rack. The cable could
be cut and the control system would not know.
Most control systems also have an instrument fault alarm when the signal is over 21
mA or some other nominated plant value.
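To illustrate the live-zero idea, here is a minimal Python sketch that scales a process
variable to 4-20 mA and flags out-of-range signals as faults. The 3.5 mA and 21 mA
limits are simply the example thresholds mentioned above; actual alarm limits are a
plant decision.

    # Illustrative sketch: scale a process variable to a 4-20 mA signal and flag
    # signals outside the healthy band as instrument faults (the "live zero" benefit).

    def pv_to_ma(pv, lrv, urv):
        """Scale a process variable to 4-20 mA over the calibrated range lrv..urv."""
        return 4.0 + 16.0 * (pv - lrv) / (urv - lrv)

    def signal_fault(ma, low=3.5, high=21.0):
        """True if the loop current is outside the nominated healthy limits."""
        return ma < low or ma > high

    print(pv_to_ma(10.0, 0.0, 20.0))   # 50% of a 0-20 kPa range -> 12.0 mA
    print(signal_fault(0.0))           # broken loop (0 mA) -> True, raise a fault alarm
    print(signal_fault(12.0))          # healthy signal -> False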

4.4.1 4-20 mA

The transmitter output signal 4-20 mA is direct current (dc). In instrumentation, the dc
is usually not stated, just implied.
The typical wiring for a 4-20 mA output is a twisted shielded pair, 18 AWG or 1.5mm2.

4.4.2 HART

A transmitter with HART output signal has a digital signal riding on the transmitter
4-20 mA output signal. This is similar to having the internet digital signal on your
home phone line. You can talk and use the internet at the same time. The digital and
the analogue signal share the wires but are independent of each other.
The Highway Addressable Remote Transducer (HART) is a bi-directional master-
slave field communications protocol used to communicate between intelligent field
instruments and the control system or a handheld device. The HART protocol makes
use of the Bell 202 Frequency Shift Keying (FSK) standard to superimpose digital
communication signals at a low level on top of the 4-20mA.
For additional information, see HART Communication Foundation
http://www.hartcomm2.org
For a HART output, the recommended wiring is a twisted shielded pair not exceeding
1,500 metres or 5000 ft. The wire size is typically 18 AWG or 1.5mm2, but should be
between 24 AWG and 14 AWG.

4.4.3 Foundation Fieldbus

A transmitter with Foundation Fieldbus only has a digital output. This is similar to
having computers networked and communicating via network cables.
In 2008, a fieldbus transmitter cost about 10% more than a traditional transmitter.
The Foundation Fieldbus is a communications protocol for control and
instrumentation systems in which each device has its own intelligence and
communicates via an all-digital, serial, two-way communications system.
For additional information, see Fieldbus Foundation http://www.fieldbus.org
For a Foundation Fieldbus output, the recommended wiring is a twisted shielded pair,
18 AWG or 1.5mm2.

4.4.4 Wireless

A transmitter with a wireless output only has a radio frequency signal. The transmitter
is similar to your wireless modem at home, where the computer has either an
internal or external device to receive the signal. Unlike your modem at home, for a
wireless transmitter to be useful it is required to be battery operated.
In 2008, a wireless transmitter cost about twice as much as a traditional transmitter.
Wireless transmitters require a gateway to receive the signal transmission. The
wireless signals are required to comply with the country's regulations for the Radio
Frequency (RF) spectrum.

4.4.5 Contacts

For switches, the output is a set of one or more contacts. Switches are digital inputs
(DI) to a control system. Calibration is also performed on switches. The calibration of
a switch is similar to a transmitter, except that rather than having a calibrated range, the
switch will have a setpoint. The switch will be set so that the contacts open or close
as the process variable either rises or falls to the required setpoint. Switches will be
discussed in detail in Transmitters & Valves.

4.5 Transmitter Power Supply

For typical transmitters, there are basically four main types of power to the
transmitter. These are 24 Vdc loop powered 2 wire, 24 Vdc 4 wire, mains power, and
battery; they are discussed below. There are other voltages used for some speciality
instruments, but 24 Vdc loop power is the most common.

4.5.1 24 Vdc Loop Powered 2 Wire

The traditional loop powered 2 wire instrument would only have two wires connected
to the instrument. The wires supply the 24 Vdc power to the transmitter and carry the
output signal. The cable is typically a twisted shielded pair, 18 AWG or 1.5mm2, with
white and black colour coding on the cores.

4.5.2 24 Vdc Loop Powered 4 Wire

There are some transmitters that require a 4 wire connection but these are not
common. One pair of wires supplies the 24 Vdc power to the transmitter and the other
pair carries the output signal. The cable is typically a 2 pair, twisted shielded cable,
18 AWG or 1.5mm2, with white and black colour coding on the cores.

4.5.3 240 Vac or 120 Vac

Some instruments require a 240 Vac or 120 Vac power supply. The need for
additional power may be due to sensor requirements or the programming/display
functions within the transmitter. Analytical instruments and magmeters are common
examples, although both are available as 2 wire in some applications. It is common for the
power cable to be under the control of the electrical department and the output signal
cable under the control of the instrument department.

4.5.4 Battery

For a wireless transmitter to be truly wireless, it will require a battery for the power
supply. The Rosemount 3051S Wireless Transmitter uses two “C” size primary
lithium/thionyl chloride batteries.

5 TEST EQUIPMENT
5.1 Read Internet Document

Read Document 375 Field Communicator Users Manual March 2007

Company Rosemount

Website http://www.fieldcommunicator.com/suppmanu.htm

Pages All of Sections 3 and 4

Comment Just lightly read.

5.2 Watch Internet Videos

Watch Videos    Rosemount 375 Basic Setup HART
                Rosemount 375 Basic Setup Fieldbus
                Rosemount 375 Zero Trim HART
                Rosemount 375 Segment Health Fieldbus
                Rosemount 375 Configure Temp Sensor
                Rosemount 375 Basic Level
                Rosemount 375 Tank Geometry
                Rosemount 375 Echo Tuning

Company         Rosemount

Website         http://www.fieldcommunicator.com/375InAction.htm

Comment         Executable application files. They only take a few minutes each
                and will assist in your understanding of the subject matter.

5.3 Test Equipment Introduction

To calibrate an instrument you will require test equipment. Continuing from the
Instrument Specification section, we will be using the Rosemount 375 Field
Communicator Manual as a reference document. The functions of most
communicators will be the same from different manufacturers but the housing and
buttons will be different. If you can understand the operations of one communicator,
you should understand the operations of all communicators.
Below are examples of test equipment and methods of testing that can be used to
validate the calibration of an instrument. If not sure of the correct method of
operation, care, or handling for any test equipment, reference the manufacturer‟s
manual and your company procedures.

5.4 HART and Fieldbus Communicator

If your plant uses HART or fieldbus, then you should have a field communicator or
handheld terminal to interface with the field instruments. The same communicator
can usually do both types of protocols.
In the Rosemount 375 Manual, reference pages 3-9 and 3-10 for the connections required
for a HART loop and pages 4-7 and 4-8 for a fieldbus loop.
If using the handheld communicator in the field on a fieldbus loop, read the warning
on page 4-8. The Field Communicator will draw 17 mA from the fieldbus segment. If
the segment is already near capacity, then the communicator may draw too much
current resulting in a loss of communication.

5.5 Pressure Source

To truly verify a pressure calibration you would need to apply pressure to the
pressure transmitter. The pressure source must have a pressure display on/from the
source with the required accuracy to validate the actual pressure being applied to the
transmitter. Your instrument shop may have dead weight testers, manometers, high
accuracy pressure gauges on a compressed air line, a vacuum pump, a handheld puffer,
or some other third-party traceable pressure standard.

5.6 Temperature Source

To confirm the calibration of a temperature transmitter, a handheld temperature simulator
is used. Since the characteristics of the various types of thermocouples and RTDs are known,
the electronics for a simulator are fairly simple. Some temperature transmitters now
have smart programming to reduce temperature sensor interchangeability errors.
To test a temperature assembly comprising the temperature sensor and
transmitter wired/mounted together, an ice bath and/or heat source
with a high accuracy thermometer is typically used.

5.7 Level Source

If it is possible to view the tank level from a maintenance hatch or from an open tank,
the actual tank level can be measured and compared to the control system display.
This is rarely done but is an option if required. I have had to resort to this method on
several occasions.
When a pressure transmitter is used for level measurement, then the transmitter is
calibrated like other pressure instruments.

5.8 Flow Source

Flow is a trickier measurement to verify. In some applications, the flow of fluid
from a tank can be used to validate the calibration. A tank is filled, the height
recorded, and no new flows are permitted to enter the tank during the test. A stop
watch is started and an outlet valve on the line with the flow transmitter is opened.
The tank is allowed to drain and the flow is recorded. After some minutes, the outlet
valve is closed and the stop watch stopped. The volume of fluid is calculated and
divided by the time in seconds to calculate the flow in m3/second. In actuality, this
rarely happens, but I have had to do it on occasion.
For a magmeter, no calibration is done; just the flow element constants are
confirmed as correctly entered into the transmitter. For a flow meter used for
billing, a third party will usually confirm the calibration using a meter prover. A flow meter
could be sent out for verification to the Australian National Measurement Institute
(NMI) if the application required high accuracy and independent confirmation. For an
orifice plate, the plate can be removed and the bore checked to confirm it still has a
sharp edge and is not corroded.
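The short Python sketch below is a simplified illustration of the tank draw-down check
described above, assuming a vertical cylindrical tank. The diameter, level drop and time
are made-up example figures, not values from any plant or data sheet.

    # Illustrative sketch: estimate average flow from a tank draw-down test on a
    # vertical cylindrical tank. All figures are made-up example values.
    import math

    def drawdown_flow(diameter_m, level_drop_m, elapsed_s):
        """Average flow in m3/s from the volume drained over the elapsed time."""
        area = math.pi * (diameter_m / 2.0) ** 2
        volume = area * level_drop_m
        return volume / elapsed_s

    flow = drawdown_flow(diameter_m=2.0, level_drop_m=0.5, elapsed_s=300.0)
    print(f"{flow:.5f} m3/s")   # about 0.00524 m3/s; compare with the flow transmitter reading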

5.9 Gas Detectors and Analysers

To calibrate a gas detector a known gas sample is used. For analysers, there are test
sample solutions available.

6 CALIBRATION
6.1 Read Internet Document

Read Document   3051S Reference Manual HART 00809-0100-4801 Rev CB January 2007

Company         Rosemount

Website         http://www.emersonprocess.com/rosemount/document/man/4801b00j.pdf

Pages           Section 2 page 4 flow chart; Sections 3 and 4, all pages

Comment         Used in the assignment section to answer questions.

6.2 Calibration Introduction

The instrument manufacturer's maintenance manual should be used when
performing a calibration or configuration on an instrument. As you become more
experienced as an instrument technician and continually work on the same
equipment, you will no longer need the manuals for calibration or configuration. At this
stage of your learning you should, as a minimum, skim through the manual.
Continuing from the Instrument Specification section, we will be using the Rosemount
3051S HART Manual.

6.3 Reasons for Performing a Calibration

There are a variety of reasons why you may have to perform a calibration on an
instrument. The common reasons are:
 New instrument in a new installation
 Replacement instrument for a faulty instrument
 Preventative maintenance replacement of instrument
 Scheduled shutdown maintenance work
 Scheduled maintenance work in an operating plant
 Instrument fault registering on control system
 Operator does not trust the values displayed
 Process operations change to process fluid parameters
 Process operations change to increase or decrease measurement range of
instrument
 TAFE course requirement
 Other

6.4 Calibration Steps

6.4.1 Step 1 Get Required Calibration Range

You cannot calibrate unless you know the calibrated range of the instrument. This
information should be available on the instrument data sheet. If not, then you might
have to calculate values. Most control systems only have the display range
programmed. For some types of measurements like temperature, knowing the
display is -10 to 60 C advises you of the instrument calibration. Obviously, the
transmitter would also have to be calibrated for -10 to 60 C.
For level and some other measurements that use a display range of 0-100%, the
control system will not provide the required information. You will have to determine it
from another source. In Transmitters & Valves, you will learn how to perform the
required calculations to determine the calibration range of the instrument.
You need to know the calibrated range to select the correct test equipment.

6.4.2 Step 2 Select Correct Test Equipment

On the odd occasion, with old style transmitters, you can calibrate an instrument with
just a screwdriver if you can adjust the process variable using the operating control
system. These applications are limited. For example, you could calibrate a level
transmitter if you had access to an operating control system and could fill the tank to
overflowing and drain completely empty without causing problems in the plant. But in
reality, this does not happen very often.
You will need some form of test equipment to do the job efficiently. Make sure you
select the correct test equipment with the required accuracy to perform the required
calibration. The test equipment selected will also determine where the calibration
must take place. For example bench in shop, field, or either.

6.4.3 Step 3 Determine Best Location for Calibration

The location where the calibration takes place will be determined by the reason for
the calibration, the process operations requirements, and the type of test equipment.
A new transmitter is usually calibrated at the workshop bench. If checking calibration
on a field installed instrument, then you most likely would do calibration in the field if
possible.

6.4.4 Step 4 Follow the Manufacturer’s Manual to Calibrate

You should use the manufacturer's maintenance manual to calibrate an instrument.


Using the Rosemount 3051S HART Manual, the steps required to calibrate and
configure an analogue output are:
 Check Transmitter Output to confirm transmitter is operating correctly and is
configured to correct process variables page 3-7
 If not configured, then configure following each step in Section 3.
 Set Process Variables Units page 3-8
 Set Output Type page 3-8
 Re-range page 3-9
 Set Damping page 3-11
 Calibrate the Sensor Trim page 4-6
 Calibrate the Zero Trim page 4-6
 Calibrate the 4-20 mA Output Trim page 4-8
 Calibrate the 4-20 mA Output Trim Using Other Scale page 4-9

6.4.5 Step 5 Verify the Calibration

No calibration is complete unless you verify the transmitter output signal represents
the correct process variable. If the transmitter is installed in the field, then you should
check with operations to confirm the control system is displaying the correct values
for the calibrated transmitter.

6.4.6 Step 6 Document the Calibration

Make sure you don't forget to do the paper work. Every plant should have some
system of documenting when an instrument was calibrated, by whom, and to what
parameters.
Knowing when calibration or replacement of the instrumentation occurred is
important to the planning department and the Asset Management System.
The PAPER WORK is just as IMPORTANT as the actual calibration.

