Chapter 1 Introduction and Basic Concepts
Metrology is defined as the science of measurement. Measurement is based on units and
standards that have been agreed upon at an international level. Measurement is the language used by
scientists and engineers to communicate about size, quantity, position, condition and time. In
general, metrology concerns the measurement of various types of physical quantities such as
temperature, length, mass, pressure, flow rate, current, voltage, velocity, acceleration, strain, etc. The
aspect of metrology discussed in this book concerns the measurement of dimensions, hence called
dimensional metrology. Dimensional metrology concerns the measurement of length, angle, thickness,
diameter, straightness, flatness, parallelism, perpendicularity and surface form.
Metrology is not limited to the measurement of the physical quantities mentioned above. It
also involves industrial inspection and all the techniques relating to it, such as statistical data
analysis, quality control, etc. Inspection as practiced in industry normally involves inspecting a
product at each stage of manufacture up to the final product. The inspection is carried out using
various gages and measuring instruments. The duty of a metrologist not only involves the use of
various methods of measurement to obtain the degree of accuracy desired, but also involves the
aspects of design, construction and testing of various types of gages and measuring instruments.
Measurements are made for three main reasons. Firstly, measurements are made to make a
product, regardless of whether the product is made by us or by someone else. Secondly,
measurements are made to control the way others make the product. This applies to jobs as simple
as ordering a window panel and as complex as producing millions of bolts and nuts.
Thirdly, measurements are needed for scientific description. It is impossible to convey definite
information about something without measurement, for example the temperature in a furnace or the
speed of an object.
Inspection is carried out to control the quality of products. In modern manufacturing, the
production process of most products, such as cars, computers, electrical appliances etc., is broken
down into the production of the individual components that make up the product. These processes
may be carried out in the same factory or in different factories. All the components are then
gathered and assembled to produce the final product. Therefore, if any two components are
selected at random it should be possible to assemble them easily and at the same time meet the
product specifications. The dimensions on each component of the final product must be within the
range of dimensions set earlier to enable the assembly of the product easily. This requires inspection
of the dimensions. Thus, dimensional measurement and inspecction play important roles in the
quality control of a product. This is illustrated in Figure 1.1.
Figure 1.2. Modern measuring instruments: digital micrometer, profile projector and digital height gage. (Source: TESA)
1.3 Calibration
Calibration is the comparison of the measuring instrument’s reading with the reading of a
standard instrument when the same input is given to both. Calibration is necessary after the
instrument is used for a long time because the reading can change over time due to several factors,
such as deterioration of the internal components of the instrument. Calibration can be carried out
in-house or the instruments can be sent to specialized metrology laboratories that carry out such
services.
Two very important terms in the science of measurement are precision and accuracy. Precision of
a measuring instrument is the degree of repeatability of the readings when measurement of the same
standard is repeated. Meanwhile, accuracy is the degree of closeness of the measured values to the
actual dimension.
The difference between precision and accuracy becomes clear when we refer to the example
shown in Figure 1.3(a)-(c). Assume that we take six measurements on a standard block of dimension
20.00 mm using three different instruments. In Figure 1.3(a) the readings taken have small deviation
from one another, but the mean reading deviates greatly from the actual dimension of 20.00 mm.
This illustrates an instrument having high precision but low accuracy. Figure 1.3(b) shows the
characteristics of an instrument having low precision but high accuracy. The precision is low
because the readings deviate greatly from one another, but the accuracy is high because the mean
value is close to the actual value. Figure 1.3(c) shows a case where the instrument has both high precision and high accuracy.
It is common to take several measurements using an instrument and calculate the mean
value. The actual value is then subtracted from the mean value to determine the error. The error can
take positive or negative values. Positive error means that the measured value is higher than the
actual value, while negative error implies that the measured value is less than the actual value. When
the error is low the instrument is said to have high accuracy and vice-versa.
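The error calculation described above can be sketched as a short function (the function name is ours, for illustration only; it is not from the text):

```python
def measurement_error(readings, actual):
    """Error = mean of the readings minus the actual value.

    A positive result means the measured value is higher than the
    actual value; a negative result means it is lower.
    """
    mean = sum(readings) / len(readings)
    return mean - actual
```

For example, hypothetical readings of 20.02, 20.04 and 20.03 mm taken on a 20.00 mm standard give a positive error of about 0.03 mm, indicating that the instrument reads high.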
There are several differences between error and accuracy. These are summarized in Table
1.1. The accuracy of an instrument is expressed with positive and negative values, e.g. ± 0.005 mm.
The actual measurement obtained using the instrument lies anywhere between the lower and upper
limits. For instance, if a dial caliper has an accuracy of ±0.005 mm and the reading shown is 0.020
mm, the measured value can lie anywhere between 0.015 and 0.025 mm.
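The limits implied by a stated accuracy can be computed directly; this helper is a sketch of ours, not part of any instrument's software:

```python
def measurement_limits(reading, accuracy):
    """Return the (lower, upper) limits implied by reading ± accuracy."""
    return reading - accuracy, reading + accuracy

# The dial-caliper example from the text: 0.020 mm read with ±0.005 mm accuracy.
low, high = measurement_limits(0.020, 0.005)
```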
Figure 1.3. Readings taken on a 20.00 mm standard block plotted against reading number. (a) High precision, low accuracy: the readings cluster closely together, but their mean deviates greatly from the actual value of 20.00 mm. (b) Low precision, high accuracy: the readings scatter widely, but their mean is close to 20.00 mm. (c) High precision and high accuracy.
The basic objective of metrology is to design a measurement system of high accuracy at the
lowest possible cost. In general, the cost of an instrument increases exponentially with the accuracy
required. This relationship between cost and accuracy is shown in Figure 1.4.
Figure 1.4. The cost of an instrument (vertical axis) plotted against the accuracy required (horizontal axis).
High instrument accuracy can be obtained if the sources of error caused by the
elements discussed in the following sections can be reduced or eliminated.
[Figure: Hysteresis, shown as the error between the curves of values obtained for increasing and decreasing readings.]
Resolution of an instrument refers to the smallest dimension that can be read using the
instrument. It is given by the smallest graduation on the instrument’s scale (Figure 1.7). The
resolution of some instruments, such as a dial gage, depends on the distance between two graduations
and the width of the needle. If the width of the needle is large compared to the distance between
two graduations, the instrument will have a low resolution, because a small movement of the
needle will not produce any noticeable change in the reading.
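The effect of a finite resolution can be modeled by rounding the true value to the nearest graduation. This is an illustrative sketch (the function name and numbers are ours, not from the text):

```python
def quantize(value, resolution=0.01):
    """Round a value to the nearest graduation of the given resolution."""
    return round(value / resolution) * resolution
```

On a scale with 0.01 resolution a true dimension of 20.0037 would be read as 20.00; any change in the input smaller than the resolution produces no change in the reading.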
Sensitivity of a measuring instrument is defined as the minimum input that will produce a
detectable output. Sensitivity depends on the amplification present in the instrument. Normally,
high sensitivity is needed to enable measurement with high precision. When the sensitivity and
accuracy of a measuring instrument are high, its reliability is also high.
Figure 1.7. The resolution is given by the smallest graduation on the scale; here the resolution is 0.01 in.
The repeatability of a measuring instrument is defined as the closeness between the readings
obtained from repeated measurement of the same quantity (or input) carried out under the same
conditions. These conditions are known as repeatability conditions. Examples of repeatability
conditions are measurement procedure, operator making the measurement, measuring instrument
used, location where the measurement is carried out and time duration between measurements. The
time duration between measurements must be short so that the instrument does not deteriorate with
time, for example by undergoing zero drift.
Reproducibility of an instrument is defined as the closeness between the readings obtained
from repeated measurements of the same quantity but under different conditions. Statement of
reproducibility requires specification of the condition that is changed. The condition that is changed
may be the person making the measurement, the place where measurement is made, the
measurement procedure etc.
Uncertainty is defined as a parameter that characterizes the dispersion of the readings taken.
In statistical analysis of the data this parameter may be the standard deviation of the readings.
Expression of uncertainty is given in more detail in Reference (1).
1.8 Tolerance
Tolerance is defined as the variation in the dimension allowed in a component. Unlike the
other terms learned in the foregoing sections, tolerance is normally applied to the product and not
the measuring instrument. For instance, if the nominal size of a component is 20 mm and the
manufacturing process produces parts that have a maximum size of 20.1 mm and a minimum size
of 19.9 mm, then the tolerance of the component is ±0.1 mm. Tolerances like this are
unavoidable in the manufacturing process because of the difficulty of producing any two parts to
exactly the same dimensions. Close tolerances normally require precise machining and hence
increase the manufacturing cost. Therefore, close tolerances should be avoided unless they are
required to ensure that the component functions as intended.
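A tolerance check of this kind can be sketched as a simple function; the defaults follow the text's 20 mm and ±0.1 mm example, and the function name is ours:

```python
def within_tolerance(measured, nominal=20.0, tol=0.1):
    """True if the measured dimension lies between nominal - tol and nominal + tol."""
    return nominal - tol <= measured <= nominal + tol
```

A part measuring 20.05 mm passes, while one measuring 19.85 mm falls outside the ±0.1 mm tolerance and would be rejected at inspection.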
Question 1.1
The figure below shows two types of dial gages. Which of the statements given best describes the
difference between these gages?
Gage A Gage B
Question 1.2
The precision and accuracy of three digital calipers A, B and C were compared by calculating the
mean and standard deviation of a set of readings taken using each caliper. Each set of readings was
recorded by repeating the measurement on a 20 mm block gage five times. The readings are
tabulated as follows:
Use the statistical functions on your calculator, or calculate manually using the given formulae.
[Given that for a set of readings x1, x2, x3, ..., xn, the mean and standard deviation are given by

x̄ = (x1 + x2 + ... + xn) / n,   σ = √( Σ(xi − x̄)² / (n − 1) ) ]
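The given formulae can be evaluated directly. A minimal sketch using only the standard library, with the sample standard deviation (n − 1 in the denominator) as specified; the readings in the usage note are hypothetical, since the question's table is not reproduced here:

```python
import math

def mean_and_std(readings):
    """Mean and sample standard deviation (n - 1 denominator) of the readings."""
    n = len(readings)
    mean = sum(readings) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
    return mean, std
```

For five hypothetical readings of 20.00, 20.02, 19.98, 20.01 and 19.99 mm this gives a mean of 20.00 mm and a standard deviation of about 0.016 mm.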