Measurement Uncertainty
2.1 Definitions Relating to Measuring Instruments
1. True or actual value: The actual magnitude of a signal input to a measuring system, which can only be approached and never exactly evaluated.
2. Indicated value: The magnitude of a variable indicated by a measuring instrument.
3. Correction: The revision applied to the indicated value so that the final result better represents the true value.
4. Scale readability: Indicates the closeness with which the scale of an analog instrument can be read.
5. Tolerance: The range of inaccuracy that can be tolerated in measurements.
6. Backlash: The maximum distance or angle through which any part of a mechanical system may be moved in one direction without applying appreciable force or motion to the next part of the system.
7. Range: The lowest and highest values of the stimulus.
8. Span: The arithmetic difference between the highest and lowest values of the input being sensed.
For example:
(i) A load cell for the measurement of forces might have a range of 0 to 50 kN and a span of 50 kN.
(ii) Range: 2 kN/m² to 50 kN/m²; Span: 50 − 2 = 48 kN/m²
(iii) Range: −5 °C to 90 °C; Span: 90 − (−5) = 95 °C
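The three range/span examples above can be sketched in a few lines of Python (a minimal sketch; the units are implied by the range, not tracked in the code):

```python
# Span is the arithmetic difference between the highest and lowest
# values of the measured range.
def span(low, high):
    """Return the span for a range given as (low, high)."""
    return high - low

print(span(0, 50))   # load cell, 0 to 50 kN      -> 50
print(span(2, 50))   # pressure, 2 to 50 kN/m^2   -> 48
print(span(-5, 90))  # temperature, -5 to 90 degC -> 95
```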
2.3 Static Characteristics
Characteristics of measurement applications in which the parameter of interest is more or less constant.
The main static characteristics may be summed up as
follows:
1. Accuracy
2. Sensitivity
3. Reproducibility
4. Drift
5. Static error
6. Dead zone
7. Hysteresis error
8. Linearity
9. Reliability
10. Resolution
Accuracy and Inaccuracy
Accuracy:
a measure of how close the output reading of the
instrument is to the correct value.
Inaccuracy or measurement uncertainty:
is the extent to which a reading might be wrong and is
often quoted as a percentage of the full-scale (f.s.)
reading of an instrument.
The accuracy of the measured signal depends upon the following factors:
Variation of the signal being measured
Intrinsic accuracy of the instrument itself
Accuracy of the observer
Whether or not the quantity is being truly impressed upon the
instrument.
Relative accuracy: A = 1 − e/Vt
Accuracy in percentage: a = 100% − % error
or a = A × 100
Where: e = absolute error, e = Vm − Vt
Vt = true value
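The accuracy relations above can be worked through numerically. This sketch takes the absolute value of the error so that accuracy never exceeds 100%; the readings are hypothetical:

```python
# e = Vm - Vt is the absolute error, A = 1 - |e|/Vt the relative
# accuracy, and a = A * 100 the accuracy in percentage.
def relative_accuracy(measured, true_value):
    e = measured - true_value       # absolute error
    return 1 - abs(e) / true_value  # relative accuracy A

vm, vt = 102.0, 100.0               # hypothetical measured and true values
A = relative_accuracy(vm, vt)
print(A)        # 0.98
print(A * 100)  # accuracy in percent
```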
Accuracy and Precision
Accuracy: The closeness with which an instrument reading
approaches the true value of the quantity being measured is
called accuracy.
Precision: A measure of the consistency or repeatability of
measurements.
Consider the results of tests on three industrial robots programmed to place components at a particular point on a table.
The target point was at the center of the concentric circles shown,
and black dots represent points where each robot actually deposited
components at each attempt.
(a) Low precision, low accuracy; (b) high precision, low accuracy; (c) high precision, high accuracy
2.3.2 Sensitivity
Sensitivity is the relationship indicating how much
output there is per unit input, i.e. 𝒐𝒖𝒕𝒑𝒖𝒕/𝒊𝒏𝒑𝒖𝒕.
Sensitivity has a wide range of units and these
depend upon the instrument or measurement system
being investigated.
For example, a resistance thermometer may have a sensitivity of
0.5Ω/°𝐶.
This term is also frequently used to indicate the sensitivity to
inputs other than that being measured, i.e. environmental
changes.
A transducer for the measurement of pressure might be quoted as having a temperature sensitivity of ±0.1% of the reading per °C change in temperature.
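With a linear model (an assumption; real sensors are only approximately linear over their range), the resistance-thermometer example above gives a simple output-per-input calculation:

```python
# Sensitivity is output change per unit input change (output/input).
# Example values from the text: 0.5 ohm/degC for a resistance
# thermometer; the 20 degC temperature rise is hypothetical.
def output_change(sensitivity, input_change):
    return sensitivity * input_change

dR = output_change(0.5, 20.0)  # 20 degC rise
print(dR)                      # 10.0 ohm change in resistance
```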
Repeatability and Reproducibility
Repeatability: It pertains to the closeness of output
readings when the same input is applied repetitively over a
short period of time with the same measurement
conditions, same instrument and observer, same location
and same conditions of use maintained throughout.
Reproducibility: It relates to the closeness of output
readings for the same input when there are changes in the
method of measurement, observer, measuring instrument,
location, conditions of use and time of measurement.
Reproducibility is usually associated with calibration.
Repeatability is given as a percentage of input full scale: the maximum difference between two readings taken at different times under identical input conditions.
Repeatability = (max − min values given) / (full range) × 100
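The repeatability formula above, applied to a set of hypothetical repeated readings at the same input:

```python
# Repeatability as a percentage of full range: the spread (max - min)
# of repeated readings divided by the instrument's full range.
def repeatability_percent(readings, full_range):
    return (max(readings) - min(readings)) / full_range * 100

readings = [50.2, 50.5, 49.9, 50.1]  # repeated readings, same input
print(repeatability_percent(readings, full_range=100.0))  # ~0.6 %
```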
Drift
Undesired gradual departure of the instrument output
over a period of time that is unrelated to changes in input,
operating conditions or load.
An instrument is said to have no drift if it reproduces the same readings at different times for the same variation in the measured variable.
The drift may be caused by the following factors:
High mechanical stresses developed in some parts of
instruments and systems
Wear and tear
Mechanical vibrations
Temperature changes
Stray electric and magnetic fields
Thermal e.m.fs.
Classification of drift:
1. Zero drift: the whole of the instrument calibration gradually shifts by the same amount.
It may be due to permanent set or slippage and can be corrected by shifting the pointer position.
2. Span drift (sensitivity drift): the calibration gradually changes in proportion to the reading.
It may be due to a change in spring gradient, etc.
3. Zonal drift: the drift occurs only over a portion of the span of an instrument.
In industrial instruments, drift is undesirable because it is rarely apparent and cannot be easily compensated for.
Drift occurs very slowly and can be checked only by
periodic inspection and maintenance.
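The zero-drift and span-drift classification above can be modeled as a constant offset and a proportional scaling of the reading. This is a sketch; the drift coefficients would come from periodic recalibration and the values here are hypothetical:

```python
# Zero drift shifts every reading by a constant; span drift scales
# readings proportionally. Correcting a drifted reading inverts both.
def correct_drift(reading, zero_drift=0.0, span_factor=1.0):
    """Recover the corrected indication from a drifted reading."""
    return (reading - zero_drift) / span_factor

# An instrument found to read 0.2 high at zero and 2% high over the span:
print(correct_drift(10.4, zero_drift=0.2, span_factor=1.02))  # ~10.0
```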
Static error
The difference between the best measured value and the
true value of the quantity.
The absolute value of error does not indicate precisely the
accuracy of measurement.
Thus another term relative static error is introduced.
The relative static error is the ratio of absolute static error to
the true value of the quantity under measurement.
Relative static error, Er = Absolute error (Es) / True value (Vt)
Percentage static error, % Er = Er × 100
Static Correction: is the difference between the true
value and the measured value of the quantity,
i.e. δC = Vt − Vm
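The static-error relations above, worked through for a hypothetical reading:

```python
# Absolute static error Es = Vm - Vt, relative static error
# Er = Es / Vt, and static correction dC = Vt - Vm.
def static_errors(measured, true_value):
    es = measured - true_value          # absolute static error
    er = es / true_value                # relative static error
    correction = true_value - measured  # static correction
    return es, er, correction

es, er, dc = static_errors(98.0, 100.0)
print(es)        # -2.0
print(er * 100)  # percentage static error
print(dc)        # 2.0 (static correction)
```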
Dead Band/Dead Time
The dead band or dead space of an instrument is the range of input values for which there is no output.
For example, bearing friction in a flow-meter using a rotor
might mean that there is no output until the input has
reached a particular velocity threshold.
The dead time is the length of time from the application
of an input until the output begins to respond and change.
A device should not operate in this range unless this
insensitivity is acceptable.
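The flow-meter example above can be sketched as a response with a dead band. The threshold and gain values are hypothetical:

```python
# Inputs below the velocity threshold produce no output (the dead
# band caused by bearing friction); above it the response is linear.
def flowmeter_output(velocity, threshold=0.5, gain=2.0):
    if velocity < threshold:
        return 0.0                     # inside the dead band
    return gain * (velocity - threshold)

print(flowmeter_output(0.3))  # 0.0 -> no output below the threshold
print(flowmeter_output(1.0))  # 1.0
```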
Hysteresis Error
The maximum difference in output at any measured value when the point is approached first with increasing and then with decreasing input.
Caused by electrical or mechanical effects, such as:
Magnetization
Thermal properties
Loose linkages
Figure: (a) end-range values (b) best straight line for all values
Reliability
Reliability: a statistical measure of quality of a device
which indicates the ability of the device to perform its
stated function, under normal operating conditions
without failure for a stated period of time or number of
cycles.
Given in hours, years or as MTBF (mean time between failures)
Usually provided by the manufacturer
Based on accelerated lifetime testing
Resolution
The smallest change of input from which there will be a
change of output.
In case of analog instruments, resolution is determined by
the observer’s ability to judge the position of a pointer on
scale.
In digital systems, resolution is generally specified as 1/2^N of full scale, where N is the number of bits.
Example: a digital voltmeter with resolution of 0.1V is
used to measure the output of a sensor.
The change in input (temperature, pressure, etc.) that
will provide a change of 0.1V on the voltmeter is the
resolution of the sensor/voltmeter system.
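The 1/2^N rule and the voltmeter example above can be checked numerically. The sensor sensitivity of 0.02 V/°C below is a hypothetical value, not from the text:

```python
# Resolution of an N-bit digital system as a fraction of full scale.
def digital_resolution(n_bits):
    return 1 / 2 ** n_bits

print(digital_resolution(10))  # 0.0009765625 (1/1024) of full scale

# The voltmeter resolves 0.1 V; if the sensor gives 0.02 V per degC,
# the smallest detectable temperature change of the combined system is:
print(0.1 / 0.02)              # ~5.0 degC
```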
Measurement Uncertainty
Errors are a property of the measurement.
Error causes a difference between the value assigned by
measurement and the true value of the population of the variable.
Uncertainty is a property of the result.
The outcome of a measurement is a result, and the uncertainty
quantifies the quality of that result.
Uncertainty analysis provides a powerful design tool for
evaluating different measurement systems and methods,
designing a test plan, and reporting uncertainty.
Errors vs. Uncertainty