
Chapter One: Review of Basic Instrumentation Systems

Introduction

Instrumentation engineering is the engineering specialization focused on the principles and
operation of measuring instruments used in the design and configuration of automated
systems in the electrical, pneumatic and related domains. Instrumentation engineers typically work
for industries with automated processes, such as chemical or manufacturing plants, with the goal
of improving system productivity, reliability, safety, optimization, and stability. To control the
parameters of a process or of a particular system, devices such as microprocessors,
microcontrollers or PLCs are used.

Instrumentation engineering is loosely defined because the required tasks are very domain
dependent. An expert in the biomedical instrumentation of laboratory rats has very different
concerns than the expert in rocket instrumentation. Common concerns of both are the selection
of appropriate sensors based on size, weight, cost, reliability, accuracy, longevity, environmental
robustness and frequency response. Some sensors are literally fired in artillery shells. Others
sense thermonuclear explosions until destroyed. Invariably sensor data must be recorded,
transmitted or displayed. Recording rates and capacities vary enormously. Transmission can be
trivial or can be clandestine, encrypted and low-power in the presence of jamming. Displays can
be trivially simple or can require consultation with human factors experts. Control system design
varies from trivial to a separate specialty.

Control and instrumentation engineers (C&I engineers) are responsible for integrating the
sensors with the recorders, transmitters, displays or control systems, and for producing the piping
and instrumentation diagram (P&ID) for the process. They may design or specify installation, wiring and
signal conditioning. They may be responsible for calibration, testing and maintenance of the
system.

Instrumentation technologists, technicians and mechanics specialize in troubleshooting, repairing
and maintaining instruments and instrumentation systems.

Basics of Instrumentation and Measurement

Measurement is the process of determining the amount, degree or capacity by comparison with
the accepted standards of the system units being used.
Instrumentation is the technology of measurement, which serves science, engineering, medicine
and other fields.
An instrument is a device for determining the value or magnitude of a quantity or variable.
An electronic instrument is one based on electrical or electronic principles for its measurement functions.
Advantages of electronic measurement
 High sensitivity – achieved through the use of amplifiers
 Higher input impedance – thus lower loading effects
 Ability to monitor remote signals

Fig. Functional block diagram of a measuring instrument


PERFORMANCE CHARACTERISTICS
Performance characteristics – characteristics that show the performance of an instrument,
e.g. accuracy, precision, resolution, sensitivity.
They allow users to select the most suitable instrument for a specific measuring job.
Two basic types of characteristics:
 Static
 Dynamic
Accuracy – the degree of exactness (closeness) of measurement compared to the expected
(desired) value.
Resolution – the smallest change in a measurement variable to which an instrument will
respond.
Precision – a measure of consistency or repeatability of measurement, i.e. successive readings do
not differ.

Expected value – the design value or the most probable value that one expects to obtain.
Error – the deviation of the measured value from the expected (true) value.
Sensitivity – the ratio of the change in the output (response) of the instrument to a change in the
input or measured variable (see the short sketch below).
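
The sensitivity ratio can be computed directly from two calibration points. The sketch below is a minimal Python illustration; the thermocouple figures (a hypothetical sensor producing 4.1 mV at 100 °C and 8.2 mV at 200 °C) are invented for the example and are not taken from this text.

```python
# Sensitivity = change in output / change in input.
# Hypothetical thermocouple readings (illustrative only): the sensor
# outputs 4.1 mV at 100 degC and 8.2 mV at 200 degC.
input_change = 200.0 - 100.0      # degC
output_change = 8.2 - 4.1         # mV

sensitivity = output_change / input_change
print(f"Sensitivity = {sensitivity:.3f} mV/degC")   # 0.041 mV/degC
```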
ERROR IN MEASUREMENT
Measurements always introduce some error.
Error may be expressed either as an absolute error or as a percentage error.
Absolute error, e = Yn − Xn, where
Yn – expected value
Xn – measured value
% error = [(Yn − Xn) / Yn] × 100
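
As a quick numerical sketch of these two formulas, the Python snippet below computes the absolute and percentage error for an assumed case where the expected value Yn is 80 V and the measured value Xn is 79 V (illustrative numbers only).

```python
# Absolute error e = Yn - Xn; percentage error = (Yn - Xn) / Yn * 100.
# Illustrative values only: expected 80 V, measured 79 V.
Yn = 80.0   # expected value (V)
Xn = 79.0   # measured value (V)

absolute_error = Yn - Xn
percent_error = (Yn - Xn) / Yn * 100

print(f"Absolute error = {absolute_error:.2f} V")   # 1.00 V
print(f"Percent error  = {percent_error:.2f} %")    # 1.25 %
```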

TYPES OF STATIC ERROR


1) Gross error/human error
2) Systematic Error
3) Random Error
1) Gross error
- caused by human mistakes in reading or using instruments
- cannot be eliminated, but can be minimized
2) Systematic error
- due to shortcomings of the instrument (such as defective or worn parts)
Types of systematic error:
(i) Instrumental error
(ii) Environmental error
(iii) Observational error

(i) Instrumental error
- inherent in the measuring instrument because of its mechanical structure (bearing friction,
irregular spring tension, stretching of the spring, etc.)
The error can be avoided by:
(a) selecting a suitable instrument for the particular measurement application
(b) applying a correction factor after determining the instrumental error
(c) calibrating the instrument against a standard
(ii) Environmental error
- due to external conditions affecting the measurement, such as changes in the surrounding
temperature, humidity, barometric pressure, etc.
The error can be avoided by:
(a) using air conditioning
(b) sealing certain components inside the instrument
(c) using magnetic shields
(iii) Observational error
- introduced by the observer
- most common: parallax error and estimation error (while reading the scale)
3) Random error
- due to unknown causes; remains even after all systematic errors have been accounted for
- an accumulation of small effects; becomes significant when a high degree of accuracy is required
The error can be reduced by:
(a) increasing the number of readings
(b) using statistical means to obtain the best approximation of the true value (see the sketch below)
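
A minimal sketch of point (b): taking several readings and using their mean, with the standard deviation as a measure of spread, as the best approximation of the true value. The readings below are invented for illustration only.

```python
import statistics

# Repeated readings of the same quantity (invented values, in volts).
readings = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00]

best_estimate = statistics.mean(readings)   # best approximation of the true value
spread = statistics.stdev(readings)         # sample standard deviation of the readings

print(f"Best estimate = {best_estimate:.3f} V")
print(f"Std deviation = {spread:.3f} V")
```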
Dynamic characteristics – concerned with measuring a varying process condition.
Instruments rarely respond instantaneously to changes in the measured variables due to such
things as mass, thermal capacitance, fluid capacitance or electrical capacitance.
The three most common variations in the measured quantity:
 Step change
 Linear change
 Sinusoidal change

The dynamic characteristics of an instrument are:
 Speed of response
 Dynamic error – the difference between the true value and the measured value when there is no
static error (illustrated in the sketch after this list).
 Lag – response delay
 Fidelity – the degree to which an instrument indicates the changes in the measured
variable without dynamic error (faithful reproduction).
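
To make lag and dynamic error concrete, the sketch below simulates a step change applied to an instrument modeled, as an assumption not stated in this text, as a simple first-order system with a time constant of 2 s. The reading lags behind the true value, and the dynamic error is their difference at each instant.

```python
import math

# Assumed first-order instrument model: the reading approaches a step change
# in the measured variable exponentially, with time constant tau.
tau = 2.0           # assumed time constant (s)
true_value = 100.0  # true value after the step (arbitrary units)

for t in [0.0, 1.0, 2.0, 4.0, 8.0]:
    measured = true_value * (1.0 - math.exp(-t / tau))  # instrument reading at time t
    dynamic_error = true_value - measured               # difference caused by lag
    print(f"t = {t:4.1f} s  reading = {measured:6.2f}  dynamic error = {dynamic_error:6.2f}")
```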
LIMITING ERROR
The accuracy of a measuring instrument is guaranteed to within a certain percentage (%) of the full-scale
reading. For example, a manufacturer may specify the instrument to be accurate to ±2% at full-scale
deflection. For readings less than full scale, the limiting error (expressed as a percentage of the actual
reading) increases.
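
The growth of the limiting error at lower readings can be shown with a short calculation. Assuming, for illustration only, a voltmeter with a 100 V full-scale range and ±2% of full-scale accuracy, the sketch below computes the limiting error as a percentage of the actual reading.

```python
# Limiting error: the guaranteed accuracy is a fixed fraction of full scale,
# so its relative size grows as the reading falls below full scale.
full_scale = 100.0   # assumed full-scale range (V)
accuracy = 0.02      # +/- 2 % of full scale

absolute_limit = accuracy * full_scale   # +/- 2 V anywhere on the scale

for reading in [100.0, 50.0, 25.0, 10.0]:
    percent_of_reading = absolute_limit / reading * 100
    print(f"Reading {reading:6.1f} V -> limiting error +/- {percent_of_reading:.1f} % of reading")
```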
ELECTRONIC INSTRUMENT
Selection, care and use of the instrument
 Before using an instrument, you should be thoroughly familiar with its operation – read
the manual carefully
 Select an instrument to provide the degree of accuracy required (accuracy + resolution +
cost)
 Before using any selected instrument, do the inspection for any physical problem
 Before connecting the instrument to the circuit, make sure the ‘function switch’ and the
‘range selector switch’ have been set to the proper function and range

Standards and their classification

Standardization is the exercise of implementing commonly agreed rules or requirements to
govern the variety of an entity available, for specific reasons. These reasons vary but often include
efficiency, safety, quality and enhancing trade. The entities themselves vary widely, from a product
to a test method to rules for conducting an activity.
The process of standardization can have many components but is normally characterized by the
building of consensus among various stakeholders (those who stand to be affected by the
standardization effort) with respect to the rules to which everyone is to abide. The outcome is
usually a ‘standard’ of one form or another. The word standard can have different meanings
depending on your point of view. The International Organization for Standardization (ISO), the body
chiefly responsible for formulating standards for application internationally, defines a standard
as: “A standard is the result of a particular standardization effort approved by a recognized
authority”. In measurement, a standard is a physical representation of a unit of measurement: it is a
piece of equipment having a known measure of a physical quantity. Such standards are used for the
measurement of other physical quantities by comparison methods.

Standards are documents setting out specifications, procedures and guidelines. They are designed
to ensure products, services and systems are safe, reliable and consistent. They are based on
industrial, scientific and consumer experience and are regularly reviewed to ensure they keep
pace with new technologies. They cover everything from consumer products and services,
construction, engineering, business, information technology, human services to energy and water
utilities, the environment and much more. Generally a standard is:

 Something considered by an authority or by general consent as a basis of comparison; an
approved model
 An object that is regarded as the usual or most common size or form of its kind, or the
authorized exemplar of a unit of weight or measure
 A rule or principle that is used as a basis for judgment
 An average or normal requirement, quality, quantity, level, grade, etc.

Standards of measurement can be classified into:

i. International standards

ii. Primary standards

iii. Secondary standards

iv. Working standards

i. International standards

These are defined on the basis of international agreement. They represent the units of
measurement to the closest possible accuracy attainable with present-day
technological and scientific methods. International standards are checked and evaluated regularly
against absolute measurements in terms of the fundamental units. These standards are maintained
at the International Bureau of Weights and Measures and are not available to the ordinary user
of measuring instruments for the purposes of calibration or comparison.
Example: one meter is defined as the length travelled by light in vacuum during a time interval of
1/299,792,458 of a second.

ii. Primary standards (Absolute standards)

The Primary standards are the absolute standards which can be used as the ultimate reference
standards. These standards are maintained by National Standards Laboratories in different parts
of the world. The primary standards which represent the fundamental units are independently
calibrated by absolute measurements at each of the national laboratories. One of the main
functions of the primary standards is the verification and calibration of secondary standards.
The primary standards are very few in number. These standards have the highest possible
accuracy and stability.

The primary standard of mass is a prototype kilogram kept at the national standards laboratory of
each country.

iii. Secondary standards

The secondary standards are the basic reference standards used in industrial measurement
laboratories. They are sent periodically to the national standards laboratories for calibration and
comparison against primary standards. The secondary standards are sent back to the industry by
the National laboratories with a certification regarding their measured values in terms of primary
standards. The secondary standards of mass are kept by industrial laboratories. These standards
are checked against the primary standards.

iv. Working standards

These standards are used to check and calibrate general laboratory instruments for accuracy
and performance. The working standards of mass and length are available in a wide range of
values so that they suit any kind of application. The working standards of mass are checked
against the secondary standards.
Why do we need standards?

Standards facilitate everyday life. They increase safety and can be used to rationalize operations.
Standardization ensures that products, services and methods are appropriate for their intended
use. It ensures that products and systems are compatible and interoperable.
A product manufactured according to standards is accepted in the international markets. Using
standards removes barriers to trade.
All manufacturing and construction, as well as installation, repair and maintenance work, is
conducted in accordance with standards. In addition, using systems, devices, and equipment, as
well as operating processes and maintaining them, requires standards.

Standards provide:

 Safety and reliability – Adherence to standards helps ensure safety, reliability and
environmental care. As a result, users perceive standardized products and services as
more dependable – this in turn raises user confidence, increasing sales and the take-up of
new technologies.
 Support of government policies and legislation – Standards are frequently referenced
by regulators and legislators for protecting user and business interests, and to support
government policies. Standards play a central role in the European Union's policy for a
Single Market.
 Interoperability – the ability of devices to work together relies on products and services
complying with standards.
 Business benefits – standardization provides a solid foundation upon which to develop
new technologies and to enhance existing practices. Specifically standards:
o Open up market access
o Provide economies of scale
o Encourage innovation
o Increase awareness of technical developments and initiatives

 Consumer choice - standards provide the foundation for new features and options, thus
contributing to the enhancement of our daily lives. Mass production based on standards
provides a greater variety of accessible products to consumers.

What would the world be like without standards?

 Products might not work as expected
 They may be of inferior quality
 They may be incompatible with other equipment – in fact they may not even connect
with them
 In extreme cases, non-standardized products may be dangerous
 Customers would be restricted to one manufacturer or supplier
 Manufacturers would be obliged to invent their own individual solutions to even the
simplest needs, with limited opportunity to compete with others

Quality assurance (QA)

In developing products and services, quality assurance is any systematic process of checking to
see whether a product or service being developed is meeting specified requirements. Many
companies have a separate department devoted to quality assurance. A quality assurance system
is said to increase customer confidence and a company's credibility, to improve work processes
and efficiency, and to enable a company to better compete with others. Quality assurance was
initially introduced in World War II when munitions were inspected and tested for defects after
they were made. Today's quality assurance systems emphasize catching defects before they get
into the final product.

Quality assurance (QA) is a way of preventing mistakes or defects in manufactured products
and avoiding problems when delivering solutions or services to customers, which ISO 9000
defines as "part of quality management focused on providing confidence that quality
requirements will be fulfilled". This defect prevention in quality assurance differs subtly from
defect detection and rejection in quality control, and has been referred to as a shift left as it
focuses on quality earlier in the process.

The terms "quality assurance" and "quality control" are often used interchangeably to refer to
ways of ensuring the quality of a service or product. For instance, the term "assurance" is often
used in the following way: the implementation of inspection and structured testing in a television-set
software project at Philips Semiconductors has been described as a measure of quality assurance. The term
"control", however, is used to describe the fifth phase of the DMAIC (Define, Measure, Analyze,
Improve, Control) model. DMAIC is a data-driven quality strategy used to improve processes.

Quality assurance comprises administrative and procedural activities implemented in a quality
system so that requirements and goals for a product, service or activity will be fulfilled. It is the
systematic measurement, comparison with a standard, monitoring of processes and an associated
feedback loop that confers error prevention. This can be contrasted with quality control, which is
focused on process output.

Two principles included in quality assurance are: "Fit for purpose" (the product should be
suitable for the intended purpose); and "right first time" (mistakes should be eliminated). QA
includes management of the quality of raw materials, assemblies, products and components,
services related to production, and management, production and inspection processes.

QA is not limited to manufacturing, and can be applied to any business or non-business activity,
including: design, consulting, banking, insurance, computer software development, retailing,
investment, transportation, education, and translation. It comprises a quality improvement
process, which is generic in the sense that it can be applied to any of these activities and it
establishes a behavior pattern, which supports the achievement of quality.

In manufacturing and construction activities, these business practices can be equated to the
models for quality assurance defined by the international standards contained in the ISO 9000
series and the associated specifications for quality systems.

The International Organization for Standardization (ISO) is an international standard-setting
body composed of representatives from various national standards organizations.

Founded on 23 February 1947, the organization promotes worldwide proprietary, industrial and
commercial standards. It is headquartered in Geneva, Switzerland and as of 2015 works in 163
countries.

Common Measurement Units

Metric

 Most of the world uses metric units of measurement
 All metric measurement units contain the word meter, liter, or gram as the base
word.
Example
 Millimeter (mm)- about the thickness of a dime
 Centimeter (cm)- about the width of a Reese’s Pieces candy
 Decimeter (dm)- 10 centimeters, or about the width of an adult’s hand
 Meter (m)- about the distance from one hand to the other when you stretch out your arms;
100 cm = 1 meter
 Kilometer (km)- 1,000 meters; about the distance you can walk in 15 minutes
 Milligram (mg)- about the mass of a grain of salt
 Gram (g)- about the mass of a dollar bill
 Kilogram (kg)- about the mass of a baseball bat
 Milliliter (ml)- used to measure medicine
 Liter (l)- half of a two-liter bottle of pop
 Celsius (C)- water freezes at 0 °C and boils at 100 °C

US Customary

 Used in the United States
 Inch (in.)- distance from thumb to first knuckle
 Foot (ft)- length of a ruler or about the height of a cat
 Yard (yd)- about the length of a baseball bat
 Mile (mi)- about the distance you can walk in 20 minutes
 Ounce (oz)- the weight of a slice of bread
 Pound (lb)- the weight of a loaf of bread
 Ton- about the weight of a bread truck
 Fluid ounce (fl oz)- measurement used for baby bottles
 Cup (c)- 8 fluid ounces
 Pint (pt)- 2 cups
 Quart (qt)- 2 pints, or 4 cups
 Gallon (gal)- 4 quarts
 Fahrenheit (F)- water freezes at 32 °F and boils at 212 °F (see the conversion sketch below)
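
The relationships listed above (100 cm = 1 m, 4 cups = 1 quart, the freezing and boiling points of water) can be checked with a few one-line conversions. The Python sketch below is purely illustrative.

```python
# A few conversions based on the relationships listed above.
meters = 250 / 100    # 100 cm = 1 m, so 250 cm = 2.5 m
cups = 3 * 4          # 1 quart = 4 cups, so 3 quarts = 12 cups

def celsius_to_fahrenheit(c):
    """Convert a Celsius temperature to Fahrenheit."""
    return c * 9 / 5 + 32

print(meters, "m")
print(cups, "cups")
print(celsius_to_fahrenheit(0), "degF")     # 32.0 (water freezes)
print(celsius_to_fahrenheit(100), "degF")   # 212.0 (water boils)
```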

WHAT IS CALIBRATION?
There are as many definitions of calibration as there are methods. According to ISA’s The
Automation, Systems, and Instrumentation Dictionary, the word calibration is defined as “a test
during which known values of measurand are applied to the transducer and corresponding output
readings are recorded under specified conditions.” The definition includes the capability to adjust
the instrument to zero and to set the desired span.

An interpretation of the definition would say that a calibration is a comparison of measuring
equipment against a standard instrument of higher accuracy to detect, correlate, adjust, rectify
and document the accuracy of the instrument being compared. Typically, calibration of an
instrument is checked at several points throughout the calibration range of the instrument. The
calibration range is defined as “the region between the limits within which a quantity is
measured, received or transmitted, expressed by stating the lower and upper range values.” The
limits are defined by the zero and span values.

The zero value is the lower end of the range. Span is defined as the algebraic difference between
the upper and lower range values. The calibration range may differ from the instrument range,
which refers to the capability of the instrument. For example, an electronic pressure transmitter
may have a nameplate instrument range of 0–750 pounds per square inch, gauge (psig) and
output of 4-to-20 milliamps (mA). However, the engineer has determined the instrument will be
calibrated for 0-to-300 psig = 4-to-20 mA. Therefore, the calibration range would be specified as
0-to-300 psig = 4-to-20 mA. In this example, the zero input value is 0 psig and zero output value
is 4 mA. The input span is 300 psig and the output span is 16 mA.
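
The relationship between the calibrated input range and the 4-to-20 mA output is a simple linear scaling. The sketch below reproduces the worked example (0-to-300 psig = 4-to-20 mA); the function name is only an illustration, not part of any standard library.

```python
def pressure_to_current(psig, lower=0.0, upper=300.0, out_lo=4.0, out_hi=20.0):
    """Map a pressure within the calibration range to the expected mA output."""
    span_in = upper - lower        # input span: 300 psig
    span_out = out_hi - out_lo     # output span: 16 mA
    return out_lo + (psig - lower) / span_in * span_out

print(pressure_to_current(0))      # 4.0 mA  (zero)
print(pressure_to_current(150))    # 12.0 mA (mid-scale)
print(pressure_to_current(300))    # 20.0 mA (full scale)
```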

Different terms may be used at your facility. Just be careful not to confuse the range the
instrument is capable of with the range for which the instrument has been calibrated.

WHAT ARE THE CHARACTERISTICS OF A CALIBRATION?
Calibration Tolerance: Every calibration should be performed to a specified tolerance. The terms
tolerance and accuracy are often used incorrectly. In ISA’s The Automation, Systems, and
Instrumentation Dictionary, the definitions for each are as follows:
Accuracy: The ratio of the error to the full scale output or the ratio of the error to the output,
expressed in percent span or percent reading, respectively.
Tolerance: Permissible deviation from a specified value; may be expressed in measurement
units, percent of span, or percent of reading.
As you can see from the definitions, there are subtle differences between the terms. It is
recommended that the tolerance, specified in measurement units, be used for the calibration
requirements performed at your facility. By specifying an actual value, mistakes caused by
calculating percentages of span or reading are eliminated. Also, tolerances should be specified in
the units measured for the calibration.
For example, you are assigned to perform the calibration of the previously mentioned 0-to-300
psig pressure transmitter with a specified calibration tolerance of ±2 psig. The output tolerance
would be:
(2 psig 300 psig) 16 mA=0.1067 mA
The calculated tolerance is rounded down to 0.10 mA, because rounding to 0.11 mA would
exceed the calculated tolerance. It is recommended that both ±2 psig and ±0.10 mA tolerances
appear on the calibration data sheet if the remote indications and output milliamp signal
are recorded.
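
The output-tolerance calculation above can be reproduced in a few lines; the rounding-down step mirrors the reasoning in the text.

```python
import math

tolerance_psig = 2.0   # specified calibration tolerance (+/- psig)
input_span = 300.0     # psig
output_span = 16.0     # mA (20 mA - 4 mA)

tolerance_ma = tolerance_psig / input_span * output_span   # 0.10666... mA
rounded_down = math.floor(tolerance_ma * 100) / 100        # round down so the limit is not exceeded

print(f"Output tolerance = {tolerance_ma:.4f} mA, use +/- {rounded_down:.2f} mA")
```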
Note that the manufacturer’s specified accuracy for this instrument may be 0.25% of full scale (FS).
Calibration tolerances should not be assigned based on the manufacturer’s specification only.
Calibration tolerances should be determined from a combination of factors. These factors
include:
• Requirements of the process
• Capability of available test equipment
• Consistency with similar instruments at your facility
• Manufacturer’s specified tolerance

Instrument calibration prompts

Calibration may be required for the following reasons:

 a new instrument
 after an instrument has been repaired or modified
 when a specified time period has elapsed
 when a specified usage (operating hours) has elapsed
 before and/or after a critical measurement
 after an event, for example
o after an instrument has been exposed to a shock, vibration, or physical damage,
which might potentially have compromised the integrity of its calibration
o sudden changes in weather
 whenever observations appear questionable or instrument indications do not match the
output of surrogate instruments
 as specified by a requirement, e.g., customer specification, instrument manufacturer
recommendation.

In general use, calibration is often regarded as including the process of adjusting the output or
indication on a measurement instrument to agree with the value of the applied standard, within a
specified accuracy. For example, a thermometer could be calibrated so the error of indication or
the correction is determined, and adjusted (e.g. via calibration constants) so that it shows the true
temperature in Celsius at specific points on the scale. This is the perception of the instrument's
end-user. However, very few instruments can be adjusted to exactly match the standards they are
compared to. For the vast majority of calibrations, the calibration process is actually the
comparison of an unknown to a known and recording the results. Basically, the purpose of
calibration is to maintain the quality of measurement as well as to ensure the proper working of a
particular instrument.
