Introduction
Instrumentation engineering is loosely defined because the required tasks are highly domain-dependent. An expert in the biomedical instrumentation of laboratory rats has very different concerns from an expert in rocket instrumentation. Common to both is the selection of appropriate sensors based on size, weight, cost, reliability, accuracy, longevity, environmental robustness and frequency response. Some sensors are literally fired in artillery shells. Others sense thermonuclear explosions until destroyed. Invariably, sensor data must be recorded, transmitted or displayed. Recording rates and capacities vary enormously. Transmission can be trivial, or it can be clandestine, encrypted and low-power in the presence of jamming. Displays can be trivially simple or can require consultation with human factors experts. Control system design varies from trivial to a separate specialty.
Control and instrumentation engineers (C&I engineers) are responsible for integrating the sensors with the recorders, transmitters, displays or control systems, and for producing the piping and instrumentation diagram (P&ID) for the process. They may design or specify installation, wiring and signal conditioning. They may also be responsible for calibration, testing and maintenance of the system.
Basics of Instrumentation and Measurement
Measurement is the process of determining the amount, degree or capacity of a quantity by comparison with the accepted standards of the system of units being used.
Instrumentation is the technology of measurement, serving science, engineering, medicine and other fields.
An instrument is a device for determining the value or magnitude of a quantity or variable.
An electronic instrument is based on electrical or electronic principles for its measurement functions.
Advantages of electronic measurement
High sensitivity, through the use of amplifiers
High input impedance, and thus lower loading effects
Ability to monitor remote signals
Expected value – the design value, or the most probable value one expects to obtain.
Error – the deviation of the measured value from the expected (true) value.
Sensitivity – the ratio of the change in the output (response) of an instrument to a change in the input or measured variable.
ERROR IN MEASUREMENT
Measurement always introduces error.
Error may be expressed either as an absolute error or as a percentage of error.
Absolute error, e = Yn − Xn, where
Yn – expected value
Xn – measured value
% error = [(Yn − Xn) / Yn] × 100
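As a quick sketch, the two error expressions above can be written as small Python helpers (the function names are ours, chosen for illustration):

```python
def absolute_error(expected, measured):
    """Absolute error e = Yn - Xn."""
    return expected - measured

def percent_error(expected, measured):
    """Percent error = [(Yn - Xn) / Yn] * 100."""
    return (expected - measured) / expected * 100.0

# A meter expected to read 80.0 V actually reads 79.0 V:
print(absolute_error(80.0, 79.0))  # 1.0  (V)
print(percent_error(80.0, 79.0))   # 1.25 (%)
```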
(i) Instrumental error
- inherent in measuring instruments because of their mechanical structure (bearing friction, irregular spring tension, stretching of the spring, etc.)
These errors can be reduced by:
(a) selecting a suitable instrument for the particular measurement application
(b) applying a correction factor after determining the instrumental error
(c) calibrating the instrument against a standard
(ii) Environmental error
- due to external conditions affecting the measurement, including surrounding conditions such as changes in temperature, humidity, barometric pressure, etc.
These errors can be reduced by:
(a) using air conditioning
(b) sealing certain components inside the instrument
(c) using magnetic shields
(iii) Observational error
- introduced by the observer
- most common: parallax error and estimation error (while reading the scale)
Random error
- due to unknown causes; occurs even after all systematic errors have been accounted for
- an accumulation of small effects; must be considered when a high degree of accuracy is required
These errors can be reduced by:
(a) increasing the number of readings
(b) using statistical means to obtain the best approximation of the true value
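Point (b) can be illustrated with a short Python sketch: the mean of repeated readings is taken as the best approximation of the true value, and the uncertainty of that mean shrinks roughly as 1/sqrt(n) as readings are added (the function and data here are illustrative):

```python
import statistics

def best_estimate(readings):
    """Return the mean of repeated readings and the standard error
    of that mean, which shrinks as the number of readings grows."""
    n = len(readings)
    mean = statistics.mean(readings)
    sem = statistics.stdev(readings) / n ** 0.5  # standard error of the mean
    return mean, sem

readings = [9.8, 10.2, 10.0, 10.1, 9.9]
mean, sem = best_estimate(readings)
print(round(mean, 3))  # 10.0
```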
Dynamic characteristics concern the measurement of a varying process condition.
Instruments rarely respond instantaneously to changes in the measured variable, due to such things as mass, thermal capacitance, fluid capacitance or electrical capacitance.
The three most common variations in the measured quantity are:
Step change
Linear change
Sinusoidal change
The dynamic characteristics of an instrument are:
Speed of response
Dynamic error – the difference between the true value and the measured value, with no static error present.
Lag – the delay in the response
Fidelity – the degree to which an instrument indicates changes in the measured variable without dynamic error (faithful reproduction).
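To make lag and dynamic error concrete, here is a minimal sketch assuming a simple first-order instrument with time constant tau responding to a step change (a common textbook model chosen for illustration, not something specified above):

```python
import math

def indicated_value(t, tau, step=1.0):
    """First-order instrument reading at time t after a step change
    of size `step` at t = 0; tau is the instrument time constant."""
    return step * (1.0 - math.exp(-t / tau))

def dynamic_error(t, tau, step=1.0):
    """Gap between the true (stepped) value and the indication."""
    return step - indicated_value(t, tau, step)

# The error decays as the instrument catches up with the input:
print(round(dynamic_error(1.0, 1.0), 3))  # 0.368
print(round(dynamic_error(3.0, 1.0), 3))  # 0.05
```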
LIMITING ERROR
The accuracy of a measuring instrument is guaranteed to within a certain percentage (%) of the full-scale reading. For example, a manufacturer may specify the instrument to be accurate to ±2 % at full-scale deflection. For readings less than full scale, the limiting error (as a percentage of the reading) increases.
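The growth of the limiting error at partial-scale readings can be sketched as follows; a hypothetical 100 V meter accurate to ±2 % of full scale is assumed purely for illustration:

```python
def limiting_error_pct_of_reading(acc_pct_fs, full_scale, reading):
    """Limiting error expressed as a percentage of the actual reading,
    for an instrument guaranteed to acc_pct_fs % of full scale."""
    absolute_limit = acc_pct_fs / 100.0 * full_scale  # fixed, in measurement units
    return absolute_limit / reading * 100.0

# +/-2 % of a 100 V full scale is a fixed +/-2 V, so:
print(limiting_error_pct_of_reading(2.0, 100.0, 100.0))  # 2.0 (% at full scale)
print(limiting_error_pct_of_reading(2.0, 100.0, 25.0))   # 8.0 (% at quarter scale)
```

The absolute error band stays fixed at ±2 V, which is why the relative error quadruples when the reading drops to a quarter of full scale.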
ELECTRONIC INSTRUMENT
Selection, care and use of the instrument
Before using an instrument, you should be thoroughly familiar with its operation – read the manual carefully.
Select an instrument that provides the degree of accuracy required (accuracy + resolution + cost).
Before using any selected instrument, inspect it for any physical problems.
Before connecting the instrument to the circuit, make sure the function switch and the range selector switch have been set to the proper function or range.
A standard can be defined in more than one way, depending on your point of view. The International Organization for Standardization (ISO), the body chiefly responsible for formulating standards for international application, defines a standard as: “A standard is the result of a particular standardization effort approved by a recognized authority”. In measurement terms, a standard is a physical representation of a unit of measurement: a piece of equipment having a known measure of a physical quantity, used for the measurement of other physical quantities by comparison methods.
Standards are documents setting out specifications, procedures and guidelines. They are designed
to ensure products, services and systems are safe, reliable and consistent. They are based on
industrial, scientific and consumer experience and are regularly reviewed to ensure they keep
pace with new technologies. They cover everything from consumer products and services,
construction, engineering, business, information technology, human services to energy and water
utilities, the environment and much more. Generally, standards are classified as international, primary, secondary and working standards.
i. International standards
These are defined on the basis of international agreement. They represent the units of measurement that come closest to the accuracy attainable with present-day technological and scientific methods. International standards are checked and evaluated regularly against absolute measurements in terms of the fundamental units. These standards are maintained at the International Bureau of Weights and Measures and are not available to the ordinary user of measuring instruments for purposes of calibration or comparison.
Example: one meter is defined as the length travelled by light in vacuum during a time interval of 1/299792458 of a second.
ii. Primary standards
The primary standards are the absolute standards which serve as ultimate reference standards. These standards are maintained by national standards laboratories in different parts of the world. The primary standards representing the fundamental units are independently calibrated by absolute measurements at each of the national laboratories. One of the main functions of the primary standards is the verification and calibration of secondary standards. The primary standards are very few in number and have the highest possible accuracy and stability.
The primary standard of mass is a prototype kilogram kept at the national physical laboratory of each country.
iii. Secondary standards
The secondary standards are the basic reference standards used in industrial measurement laboratories. They are sent periodically to the national standards laboratories for calibration and comparison against the primary standards, and are returned to industry with a certificate of their measured values in terms of the primary standards. The secondary standards of mass are kept by industrial laboratories and are checked against the primary standards.
iv. Working standards
The working standards are used to check and calibrate general laboratory instruments for accuracy and performance. The working standards of mass and length are available in a wide range of values so that they suit almost any application. The working standards of mass are checked against the secondary standards.
Why do we need standards?
Standards facilitate everyday life. They increase safety and can be used to rationalize operations.
Standardization ensures that products, services and methods are appropriate for their intended
use. It ensures that products and systems are compatible and interoperable.
A product manufactured according to standards is accepted in the international markets. Using
standards removes barriers to trade.
All manufacturing and construction, as well as installation, repair and maintenance work, is conducted in accordance with standards. In addition, using, operating and maintaining systems, devices and equipment all require standards.
Standards provide:
Safety and reliability – Adherence to standards helps ensure safety, reliability and
environmental care. As a result, users perceive standardized products and services as
more dependable – this in turn raises user confidence, increasing sales and the take-up of
new technologies.
Support of government policies and legislation – Standards are frequently referenced
by regulators and legislators for protecting user and business interests, and to support
government policies. Standards play a central role in the European Union's policy for a
Single Market.
Interoperability – the ability of devices to work together relies on products and services
complying with standards.
Business benefits – standardization provides a solid foundation upon which to develop
new technologies and to enhance existing practices. Specifically, standards:
o Open up market access
o Provide economies of scale
o Encourage innovation
o Increase awareness of technical developments and initiatives
Consumer choice - standards provide the foundation for new features and options, thus
contributing to the enhancement of our daily lives. Mass production based on standards
provides a greater variety of accessible products to consumers.
In developing products and services, quality assurance is any systematic process of checking to
see whether a product or service being developed is meeting specified requirements. Many
companies have a separate department devoted to quality assurance. A quality assurance system
is said to increase customer confidence and a company's credibility, to improve work processes
and efficiency, and to enable a company to better compete with others. Quality assurance was
initially introduced in World War II when munitions were inspected and tested for defects after
they were made. Today's quality assurance systems emphasize catching defects before they get
into the final product.
The terms "quality assurance" and "quality control" are often used interchangeably to refer to ways of ensuring the quality of a service or product. For instance, the term "assurance" appears in descriptions such as: "Implementation of inspection and structured testing as a measure of quality assurance in a television set software project at Philips Semiconductors." The term "control", however, is used to describe the fifth phase of the DMAIC (Define, Measure, Analyze, Improve, Control) model, a data-driven quality strategy used to improve processes.
Two principles included in quality assurance are: "Fit for purpose" (the product should be
suitable for the intended purpose); and "right first time" (mistakes should be eliminated). QA
includes management of the quality of raw materials, assemblies, products and components,
services related to production, and management, production and inspection processes.
QA is not limited to manufacturing, and can be applied to any business or non-business activity,
including: design, consulting, banking, insurance, computer software development, retailing,
investment, transportation, education, and translation. It comprises a quality improvement process, which is generic in the sense that it can be applied to any of these activities, and it establishes a behavior pattern that supports the achievement of quality.
In manufacturing and construction activities, these business practices can be equated to the models for quality assurance defined by the international standards contained in the ISO 9000 series and the associated specifications for quality systems.
Founded on 23 February 1947, ISO promotes worldwide proprietary, industrial and commercial standards. It is headquartered in Geneva, Switzerland and, as of 2015, works in 163 countries.
Common Measurement Units
Measurement units fall into two broad systems: metric (SI) and US customary.
WHAT IS CALIBRATION?
There are as many definitions of calibration as there are methods. According to ISA’s The
Automation, Systems, and Instrumentation Dictionary, the word calibration is defined as “a test
during which known values of measurand are applied to the transducer and corresponding output
readings are recorded under specified conditions.” The definition includes the capability to adjust
the instrument to zero and to set the desired span.
The zero value is the lower end of the range. Span is defined as the algebraic difference between
the upper and lower range values. The calibration range may differ from the instrument range,
which refers to the capability of the instrument. For example, an electronic pressure transmitter
may have a nameplate instrument range of 0–750 pounds per square inch, gauge (psig) and
output of 4-to-20 milliamps (mA). However, the engineer has determined the instrument will be
calibrated for 0-to-300 psig = 4-to-20 mA. Therefore, the calibration range would be specified as
0-to-300 psig = 4-to-20 mA. In this example, the zero input value is 0 psig and zero output value
is 4 mA. The input span is 300 psig and the output span is 16 mA.
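The 0-to-300 psig = 4-to-20 mA relationship above is a straight-line mapping, which can be sketched in Python as follows (the function name is ours, chosen for illustration):

```python
def ideal_output_ma(pressure_psig, zero_in=0.0, span_in=300.0,
                    zero_out=4.0, span_out=16.0):
    """Ideal transmitter output for a linear 0-300 psig = 4-20 mA
    calibration: zero output plus the proportional share of the span."""
    return zero_out + (pressure_psig - zero_in) / span_in * span_out

print(ideal_output_ma(0.0))    # 4.0  mA at zero input
print(ideal_output_ma(150.0))  # 12.0 mA at mid-span
print(ideal_output_ma(300.0))  # 20.0 mA at full input span
```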
Different terms may be used at your facility. Just be careful not to confuse the range the
instrument is capable of with the range for which the instrument has been calibrated.
WHAT ARE THE CHARACTERISTICS OF A CALIBRATION?
Calibration Tolerance: Every calibration should be performed to a specified tolerance. The terms
tolerance and accuracy are often used incorrectly. In ISA’s The Automation, Systems, and
Instrumentation Dictionary, the definitions for each are as follows:
Accuracy: The ratio of the error to the full scale output or the ratio of the error to the output,
expressed in percent span or percent reading, respectively.
Tolerance: Permissible deviation from a specified value; may be expressed in measurement
units, percent of span, or percent of reading.
As you can see from the definitions, there are subtle differences between the terms. It is recommended that the tolerance, specified in measurement units, be used for the calibration requirements performed at your facility. By specifying an actual value, mistakes caused by calculating percentages of span or reading are eliminated. Tolerances should also be specified in the units measured for the calibration.
For example, you are assigned to perform the calibration of the previously mentioned 0-to-300
psig pressure transmitter with a specified calibration tolerance of ±2 psig. The output tolerance
would be:
(2 psig ÷ 300 psig) × 16 mA = 0.1067 mA
The calculated tolerance is rounded down to 0.10 mA, because rounding to 0.11 mA would
exceed the calculated tolerance. It is recommended that both ±2 psig and ±0.10 mA tolerances
appear on the calibration data sheet if the remote indications and output milliamp signal
are recorded.
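The tolerance conversion above, including the round-down step, can be sketched as follows (the helper name is illustrative):

```python
import math

def output_tolerance_ma(tol_psig=2.0, span_psig=300.0, span_ma=16.0):
    """Convert a +/-2 psig input tolerance to the output side:
    (2 / 300) * 16 mA = 0.1067 mA, then round DOWN to 0.01 mA so the
    applied tolerance never exceeds the calculated one."""
    exact = tol_psig / span_psig * span_ma
    return math.floor(exact * 100.0) / 100.0  # truncate, do not round up

print(output_tolerance_ma())  # 0.1
```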
Note the manufacturer’s specified accuracy for this instrument may be 0.25% full scale (FS).
Calibration tolerances should not be assigned based on the manufacturer’s specification only.
Calibration tolerances should be determined from a combination of factors. These factors
include:
• Requirements of the process
• Capability of available test equipment
• Consistency with similar instruments at your facility
• Manufacturer’s specified tolerance
Instrument calibration is prompted:
with a new instrument
after an instrument has been repaired or modified
when a specified time period has elapsed
when a specified usage (operating hours) has elapsed
before and/or after a critical measurement
after an event, for example
o after an instrument has been exposed to a shock, vibration, or physical damage,
which might potentially have compromised the integrity of its calibration
o sudden changes in weather
whenever observations appear questionable or instrument indications do not match the
output of surrogate instruments
as specified by a requirement, e.g., customer specification, instrument manufacturer
recommendation.
In general use, calibration is often regarded as including the process of adjusting the output or indication on a measurement instrument to agree with the value of the applied standard, within a specified accuracy. For example, a thermometer could be calibrated so the error of indication or the correction is determined, and adjusted (e.g. via calibration constants) so that it shows the true temperature in Celsius at specific points on the scale. This is the perception of the instrument's end-user. However, very few instruments can be adjusted to exactly match the standards they are compared to. For the vast majority of calibrations, the calibration process is actually the comparison of an unknown to a known, and the recording of the results. Basically, the purpose of calibration is to maintain the quality of measurement and to ensure the proper working of a particular instrument.