Measurements
By Dr. Malik Al Amayreh
© Amayreh 2013
(Figure: general measurement scheme)
►Definitions of terms:
Readability: indicates the closeness with which the scale of the instrument may be read. An instrument with a 12-in scale would have a higher readability than an instrument with a 6-in scale and the same range.
Readability depends on: scale length, spacing of graduations, and size of the pointer (or pen, if a recorder is used).
(Figure: pressure sensor sensitivity example)
►Definitions of terms:
Hysteresis: the difference in the readings (output) of the instrument depending on whether the value of the measured quantity is approached from above or from below.
Hysteresis may be the result of mechanical friction, magnetic effects, elastic deformation, or thermal effects.
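The definition above can be illustrated numerically: record the output once while the input increases to each value and once while it decreases, then take the largest difference. The readings below are illustrative, not from the text.

```python
# Hysteresis sketch: compare instrument output when the measured quantity
# is approached from below (increasing input) vs. from above (decreasing).
# All readings here are made-up illustrative values.
increasing = {0: 0.0, 5: 4.8, 10: 9.7}  # output as input rises to each value
decreasing = {0: 0.4, 5: 5.3, 10: 9.7}  # output as input falls to each value

# Hysteresis at each input point = downscale reading - upscale reading.
hysteresis = {x: decreasing[x] - increasing[x] for x in increasing}
max_hysteresis = max(abs(h) for h in hysteresis.values())
print(round(max_hysteresis, 2))  # 0.5
```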
►It is seen that the instrument could not be depended on for an accuracy of better than 5% (5 V), while a precision of ±1% is indicated, since the maximum deviation from the mean reading of 104 V is 1 V.
Rmax(%) = (Tmax − Tmin) / Tmin × 100%
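The precision figure quoted above can be checked with the stated numbers (mean reading 104 V, maximum deviation from the mean 1 V); a maximum deviation of 1 V out of 104 V corresponds to roughly ±1%.

```python
# Precision expressed as a percentage of the mean reading, using the
# numbers quoted in the example above.
mean_reading = 104.0  # V, mean of the voltmeter readings
max_deviation = 1.0   # V, largest deviation from the mean
precision_pct = max_deviation / mean_reading * 100.0
print(f"precision = ±{precision_pct:.1f}%")  # ±1.0%
```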
Example
• Let's look at the resolution of the two stopwatches:
(Figure: analog stopwatch)
►Definitions of terms:
Threshold: starting from a measurand of value zero, the smallest initial increment in the measurand that results in a detectable output.
Introduction to Measurement Systems
Standards:
When a measurement system is calibrated, it is compared with some standard whose value is presumably known.
A dimension defines a physical variable that is used to describe some aspect of a physical system.
The fundamental value associated with any dimension is given by a unit. A unit defines a measure of a dimension.
Mass:
Originally, the unit of the kilogram was defined by the mass of one liter of water at room temperature.
Today, the kilogram is defined by a standard mass maintained at the International Bureau of Weights and Measures located in Sèvres, France.
In the U.S. engineering unit system, mass is defined by the pound-mass, lbm, which can be derived from the definition of the kilogram:
1 lbm = 0.45359237 kg.
Equivalent standards for the kilogram and the pound-mass are maintained by the U.S. National Bureau of Standards (NBS) in Gaithersburg, Maryland.
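The conversion factor above is exact by definition, so converting between the two units is a single multiplication or division. A minimal sketch (function names are illustrative):

```python
# Conversion between pound-mass and kilogram, using the exact
# definition quoted in the text: 1 lbm = 0.45359237 kg.
LBM_TO_KG = 0.45359237  # kg per lbm (exact by definition)

def lbm_to_kg(lbm: float) -> float:
    """Convert a mass in pound-mass to kilograms."""
    return lbm * LBM_TO_KG

def kg_to_lbm(kg: float) -> float:
    """Convert a mass in kilograms to pound-mass."""
    return kg / LBM_TO_KG

print(lbm_to_kg(1.0))            # 0.45359237
print(round(kg_to_lbm(1.0), 4))  # 2.2046
```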
Time:
The dimension of time is defined by the unit of a second in both SI and U.S. engineering units.
One second has been defined as the time elapsed during 9,192,631,770 periods of radiation emitted between two excitation levels of the fundamental state of cesium-133.
Temperature:
The basic SI unit of a kelvin, K, defines the absolute practical scale.
The Kelvin scale is based on polynomial interpolation between the equilibrium phase-change points of a number of common pure substances, from the triple point of equilibrium hydrogen (13.81 K) to the freezing point of pure gold (1337.58 K).
°C = K − 273.15
°F = °R − 459.67
°F = 1.8 °C + 32.0
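The three relations above translate directly into conversion functions (function names are illustrative):

```python
# Temperature-scale conversions corresponding to the three relations above.
def kelvin_to_celsius(k: float) -> float:
    """°C = K - 273.15"""
    return k - 273.15

def rankine_to_fahrenheit(r: float) -> float:
    """°F = °R - 459.67"""
    return r - 459.67

def celsius_to_fahrenheit(c: float) -> float:
    """°F = 1.8 °C + 32.0"""
    return 1.8 * c + 32.0

print(kelvin_to_celsius(273.15))      # 0.0   (ice point of water)
print(celsius_to_fahrenheit(100.0))   # 212.0 (boiling point of water)
print(rankine_to_fahrenheit(459.67))  # 0.0
```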
(Figure: thermometer with Fahrenheit degree units marked on the outer bezel and Celsius on the inner dial)
Electrical Units:
Ampere: unit of current. Amperes are used to express the flow rate of electric charge.
Ohm: defined by 0.9995 times the resistance to current flow of a column of mercury that is 1.063 m in length and has a mass of 0.0144521 kg at 273.15 K.
Measurement:
An act of assigning a specific value to a physical variable (the measurand).
A quantitative comparison between a predefined standard and the measurand.

(Block diagram: Input (measurand) → Instrument/Process → Readout (result))

►The standard of comparison is usually defined by an organization or agency, e.g.:
ISO: International Organization for Standardization.
National Bureau of Standards (NBS).
American National Standards Institute (ANSI).
Fundamental Methods of Measurement (Calibration):
(a) Direct comparison with a standard.
(b) Indirect comparison with a standard (using a calibrated system or measuring device).
• Method of Measurement:
1. Analog: measured values vary in a continuous manner over a range of magnitudes (most measurements).
2. Digital: values change stepwise between two distinct magnitudes.
► Validity of measurements:
Error = measured value − true value.
► Experimental errors generally fall into two categories:
(a) Bias errors (fixed or systematic errors).
(b) Precision errors (random errors).
►Sources of bias errors:
(1) Calibration errors:
• The calibration process itself.
• Nonlinearity (input and output are treated as if they have a linear relationship although it is actually nonlinear).
(2) Loading errors: alteration of the measurand due to the insertion of the measuring device (e.g., temperature). Measuring devices with negligible loading errors are called non-intrusive.
(3) Related errors (spatial errors): the measuring system is affected by variables other than the measurand.
Precision errors:
The errors caused by a lack of repeatability in the output of the measuring system.
Precision error = reading − average of the readings of the measurand.
►Example:
In a calibration test, 10 measurements have been made with a digital voltmeter of the voltage of a battery that is known to have a true value of 6.11 V. The readings are: 5.98, 6.05, 6.10, 6.06, 5.99, 5.96, 6.02, 6.02, 6.09, 6.03 V. Estimate the bias and the maximum precision errors caused by the voltmeter.
Solution:
Average of readings = 6.03 V.
Bias error = average value − true value
= 6.03 − 6.11
= −0.08 V.
Max. precision error = maximum-deviation reading − average
= 5.96 − 6.03
= −0.07 V. #
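The arithmetic in the solution above can be reproduced directly from the list of readings:

```python
# Bias and maximum precision error for the voltmeter calibration example.
readings = [5.98, 6.05, 6.10, 6.06, 5.99, 5.96, 6.02, 6.02, 6.09, 6.03]
true_value = 6.11  # V, known true voltage of the battery

average = sum(readings) / len(readings)  # mean of all readings
bias_error = average - true_value        # fixed (systematic) error
# Precision error of each reading = reading - average; report the
# largest magnitude (here 0.07 V, from the 5.96 V reading).
max_precision_error = max(abs(r - average) for r in readings)

print(f"average             = {average:.2f} V")             # 6.03 V
print(f"bias error          = {bias_error:.2f} V")          # -0.08 V
print(f"max precision error = {max_precision_error:.2f} V") # 0.07 V
```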