
EXCEL ENGINEERING COLLEGE

DEPARTMENT OF MECHANICAL
ENGINEERING

PREPARED BY
Mr. N. TAMILSELVAN
Assistant Professor
ME8501 METROLOGY AND
MEASUREMENTS

Unit-I Basics of Metrology


Syllabus
UNIT I BASICS OF METROLOGY 
Introduction to Metrology – Need – Elements – Work piece, Instruments – Persons
– Environment – their effect on Precision and Accuracy – Errors – Errors in
Measurements – Types – Control – Types of standards.
UNIT II LINEAR AND ANGULAR MEASUREMENTS 
Linear Measuring Instruments – Evolution – Types – Classification – Limit gauges
– gauge design – terminology – procedure – concepts of interchangeability and
selective assembly – Angular measuring instruments – Types – Bevel protractor,
clinometers, angle gauges, spirit levels, sine bar – Angle alignment telescope –
Autocollimator – Applications.
UNIT III ADVANCES IN METROLOGY 
Basic concept of lasers – Advantages of lasers – Laser interferometers – Types – DC and
AC laser interferometers – Applications – Straightness – Alignment. Basic concept of
CMM – Types of CMM – Constructional features – Probes – Accessories – Software
– Applications – Basic concepts of Machine Vision System – Element – Applications.
UNIT IV FORM MEASUREMENT 
Principles and Methods of straightness – Flatness measurement – Thread
measurement, gear measurement, surface finish measurement, Roundness
measurement – Applications.
UNIT V MEASUREMENT OF POWER, FLOW AND TEMPERATURE 
Force, torque, power – mechanical, Pneumatic, Hydraulic and Electrical type. Flow
measurement: Venturimeter, Orifice meter, rotameter, pitot tube – Temperature:
bimetallic strip, thermocouples, electrical resistance thermometer – Reliability and
Calibration – Readability and Reliability
OUTCOMES:
Upon the completion of this course the students will be able to
CO1 Describe the concepts of measurements to apply in various metrological
instruments
CO2 Outline the principles of linear and angular measurement tools used for
industrial applications
CO3 Explain the procedure for conducting computer aided inspection
CO4 Demonstrate the techniques of form measurement used for industrial
components
CO5 Discuss various measuring techniques of mechanical properties in industrial
applications
TEXT BOOKS:
1. Gupta. I.C., “Engineering Metrology”, Dhanpat Rai Publications, 2005.
2. Jain R.K. “Engineering Metrology”, Khanna Publishers, 2009.
Objectives
 To provide knowledge on various metrological
equipment available to measure the dimensions
of components.
 To provide knowledge on the correct procedure
to be adopted to measure the dimensions of
components.
INTRODUCTION
Definitions:
The word metrology is derived from two Greek words

Metron = measurement
Logos = science
 Thus metrology is the science of measurement
Metrology is the field of knowledge concerned with
measurement and includes both theoretical and practical
problems with reference to measurement.
 Metrology is the science of weights and measures
Metrology is the science concerned with the establishment,
reproduction, conversion and transfer of units of
measurement and their standards
Need of Inspection
• To ensure the material, parts and components conform to
the established standards
• To meet the interchangeability of manufacture
• To provide the means of finding the problem area
for meeting the established standards
• To produce the parts having acceptable quality
levels with reduced scrap and wastage
• To purchase good quality of raw materials, tools
and equipment that govern the quality of finished
products
• To take necessary efforts to measure and reduce
the rejection percentage
• To judge the possibility of rework of defective parts
Elements of Metrology
• Standard
• The most basic element of measurement is standard without
which no measurement is possible.
• Standard is a physical representation of unit of measurement.
• Different standards have been developed for various units,
including fundamental units as well as derived units.
• Workpiece
• Workpiece is the object to be measured/measured part
• Variations in geometry and surface finish of the measured part
directly affect measuring system’s repeatability
• Compressible materials like plastics or nylon pose a different
type of problem: any gauging pressure will distort the
material. This can be avoided by fixing the gauging pressure as
suggested by the industry so that everyone gets uniform
results
• Instruments
• Instrument is a device with the help of which the measurement
can be done
• The instrument must be selected based on the tolerance of the
parts to be measured, the type of environment and the skill
level of operators
• The type of instrument the customer prefers for the
measurement should also be considered.
• Person
• There must be some person or mechanism to carry out the
measurement
• Modern gauges are increasingly easy to use, but failure to
adequately train the operating personnel will lead to poor
performance
• Environment
• The measurement should be performed under standard
environment
• Temperature, dirt, humidity and vibration are the four
environmental factors that influence measurement.
• Readings of a vernier caliper may vary when the measurement
is repeated ‘n’ times for the same dimension. Such variation
is indirectly related to the environment – temperature,
humidity, conditioning, etc.
Elements of Generalized Measurement System
• Primary sensing element
• The primary sensing element receives signal of the physical
quantity to be measured as input.
• It converts the signal to a suitable form (electrical,
mechanical or other form), so that it becomes easier for other
elements of the measurement system, to either convert or
manipulate it.
• Variable Conversion Element:
• Variable conversion element converts the output of the
primary sensing element to a more suitable form. It is used
only if necessary.
• Variable Manipulation Element:
• Variable manipulation element manipulates and amplifies the
output of the variable conversion element. It also removes
noise (if present) in the signal.
• Data Processing Element:
• It processes the data signal received from the variable
manipulation element and produces suitable output.
• Data processing element may also be used to compare the
measured value with a standard value to produce required
output.
• Data Transmission System:
• Data Transmission System is simply used for transmitting data
from one element to another.
• It acts as a communication link between different elements of
the measurement system.
• Some of the data transmission elements used are cables,
wireless antennae, transducers, telemetry systems, etc.
• Data Presentation Element:
• It is used to present the measured physical
quantity in a human readable form to the observer.
• It receives processed signal from data processing element and
presents the data in a human readable form.
• LED displays are most commonly used as data presentation
elements in many measurement systems.
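The chain of elements described above can be sketched as a composition of simple functions. All stage behaviours below (gain values, the count scaling, the display format) are invented for illustration only:

```python
# Sketch of a generalized measurement system: each element is a function,
# and the overall system is their composition. All numeric behaviours
# (10 mV/deg C gain, x10 amplification) are assumptions for illustration.
def primary_sensor(temperature_c):
    """Primary sensing element: physical quantity -> voltage signal."""
    return 0.01 * temperature_c          # assumed 10 mV per deg C

def variable_conversion(voltage):
    """Variable conversion element: voltage -> digital counts."""
    return round(voltage * 1000)

def variable_manipulation(counts):
    """Variable manipulation element: amplifies the converted signal."""
    return counts * 10

def data_presentation(value):
    """Data presentation element: human-readable form."""
    return f"{value / 100:.1f} deg C"

def measure(temperature_c):
    signal = primary_sensor(temperature_c)
    signal = variable_conversion(signal)
    signal = variable_manipulation(signal)
    return data_presentation(signal)

print(measure(25.0))   # "25.0 deg C"
```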
The liquid or gas filled temperature bulb acts as primary sensing element and
variable conversion element. It senses the input quantity(temperature) and
converts it into a pressure built up within the bulb. This pressure is transmitted
through the capillary tube (which acts as data transmission element) to a spiral
bourdon type pressure gauge. Bourdon tube acts as a variable conversion element
which converts the pressure into displacement. The linkage manipulates the
displacement, acting as a variable manipulation element. The pointer and scale indicate the temperature,
thus serving as data presentation elements.
Accuracy:
Accuracy may be defined as the ability of an instrument to respond to
a true value of measured variable under the reference conditions. It
refers how closely the measured value agrees with the true value.
The difference between the measured value and the true value is
known as the error of measurement.
Precision:
Precision may be defined as the degree of exactness for which an
instrument is designed or intended to perform. It refers the
repeatability or consistency of measurement when the
measurements are carried out under identical conditions
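The two properties can be illustrated numerically: the closeness of the mean of repeated readings to the true value indicates accuracy, while their spread indicates precision. A minimal sketch with invented readings:

```python
import statistics

TRUE_VALUE = 25.00  # mm, assumed true size of the measured part

# Hypothetical repeated readings from two instruments (invented values)
readings_a = [25.01, 24.99, 25.02, 24.98, 25.00]  # accurate and precise
readings_b = [25.30, 25.31, 25.29, 25.30, 25.31]  # precise but inaccurate

for name, readings in [("A", readings_a), ("B", readings_b)]:
    mean = statistics.mean(readings)
    spread = statistics.stdev(readings)   # small spread  -> high precision
    error = mean - TRUE_VALUE             # small |error| -> high accuracy
    print(f"Instrument {name}: mean={mean:.3f}, "
          f"stdev={spread:.3f}, error={error:+.3f}")
```

Instrument B repeats its reading closely (high precision) yet its mean sits 0.3 mm from the true value (poor accuracy).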
Effects of elements of metrology on Precision
and Accuracy
• Factors affecting the standard of measurement
• Coefficient of thermal expansion
• Elastic properties of a material
• Stability with time
• Calibration interval
• Geometric compatibility
• Factors affecting the workpiece to be measured
• Coefficient of thermal expansion of material
• Elastic properties of a material
• Cleanliness, surface finish, surface defects such as scratches,
waviness etc.,
• Adequate datum on the workpiece
• Thermal equalization
• Factors affecting the characteristics of an instrument
• Scale error
• Repeatability and readability
• Calibration errors
• Effect of friction, zero drift, backlash etc.,
• Inadequate amplification
• Deformation when heavy workpieces are measured
• Constant geometry for both workpiece and standard.
• Factors affecting person
• Training/skill
• Ability to select the measuring instruments and standard
• Attitude towards accuracy
• Planning measurement techniques for minimum
cost, consistent with precision requirements etc.
• Factors affecting environment
• Temperature, humidity, atmosphere, pressure etc.,
• Clean surrounding and minimum vibration enhance precision
• Temperature equalization between standard,
workpiece and instrument.
• Thermal expansion effects due to heat radiation from
lights, heating elements, sunlight and people.
• Manual handling may also introduce thermal expansion.
Errors in Measurement
• An error may be defined as the difference between the
measured value and the actual value
• True value may be defined as the average value of an infinite
number of measured values
• Measured value can be defined as the estimated value of true
value that can be found by taking several values during an
experiment.
• Error in measurement=Measured value-True value
The Errors in measurement can be expressed either as an
absolute or relative error
• Absolute Error: Absolute error is the algebraic difference
between the measured value and the true value of the
quantity measured. It is further sub divided into
• True absolute error
• The algebraic difference between the measured average value and
the conventional true value of the quantity measured is called true
absolute error
• Apparent Absolute error
• While taking the series of measurement, the algebraic difference
between one of the measured values of the series of measurement
and the arithmetic mean of all measured values in the same series is
called apparent absolute error.
• Relative Error: It is the ratio of the absolute error to the value
of comparison (the true or measured value) used for the
calculation of the absolute error.
• Absolute error=True value-Measured value=300-280=20 units
• Relative error=Absolute error/True
value=20/300≈0.067=6.7%
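The calculation can be sketched in code. The values follow the worked example above, and the relative error is taken with respect to the true value; note that sign conventions for error differ between texts:

```python
def absolute_error(true_value, measured_value):
    """Algebraic difference between the true and measured values."""
    return true_value - measured_value

def relative_error(true_value, measured_value):
    """Absolute error expressed as a fraction of the true value."""
    return absolute_error(true_value, measured_value) / true_value

abs_err = absolute_error(300, 280)   # 20 units
rel_err = relative_error(300, 280)   # 20/300, about 6.7 %
print(abs_err, f"{rel_err:.1%}")     # 20 6.7%
```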
Types of Errors in Measurement system
• Gross Errors
• Gross errors are caused by mistakes in using instruments,
calculating measurements and recording data results.
• Eg: The operator or person reads the pressure gauge
reading as 1.10 N/m2 instead of 1.01 N/m2
• This may be the reason for gross errors in the reported
data and such error may end up in calculation of the final
results, thus producing deviated results.
• Blunders
• Blunders are caused by faulty recording or due to a wrong
value while recording a measurement, or misreading a
scale or forgetting a digit while reading a scale.
• Measurement Error
• The measurement error is the difference between a
measured value and the true value.
• The best example of measurement error: if an electronic
scale is loaded with a 1 kg standard weight and the reading is
1002 grams, then
• The measurement error = (1002 grams - 1000 grams) = 2 grams
• Measurement Errors are classified into two types: systematic
error and random errors
 Systematic error
• The errors that occur due to fault in the measuring device are
known as systematic errors.
• These errors can be removed by correcting the measurement
device.
• These errors may be classified into different categories.
• Instrumental Errors
• Environmental Errors
• Observational Errors
• Theoretical Errors
 Instrumental Errors
• Instrumental errors occur due to faulty construction of the
measuring instruments.
• These errors may occur due to hysteresis or friction.
• In order to reduce these errors in measurement, different
correction factors must be applied and in the extreme condition
instrument must be recalibrated carefully.
 Environmental Errors
• The environmental errors occur due to some
external conditions of the instrument.
• External conditions mainly include pressure,
temperature, humidity or due to magnetic fields.
• To reduce the environmental errors
• Try to maintain the humidity and temperature constant in the
laboratory by making some arrangements.
• Ensure that there is no external electrostatic or
magnetic field around the instrument.
 Observational Errors
• As the name suggests, these types of errors occur due to wrong
observations or readings of the instruments, particularly in the
case of energy meter readings.
• The wrong observations may be due to PARALLAX.
• In order to reduce the PARALLAX error highly accurate meters are
needed: meters provided with mirror scales.
 Theoretical Errors
• Theoretical errors are caused by simplification of the model system.
For example, if a theory states that the surrounding temperature
will not change the readings taken when it actually does, this
assumption becomes a source of error in measurement.
 Random Errors
• These are errors due to unknown causes, and they occur even when
all systematic errors have been accounted for.
• These are caused by any factors that randomly
affect the measurement of the variable across the sample.
• Such errors normally follow the laws of probability.
Methods of Measurement
• Direct method
• In this method, the quantity to be measured is directly
compared with the primary or secondary standard.
• This method is widely employed in the production field.
• In this method, a very slight difference exists between the
actual and the measured values because of the limitation
of the human being performing the measurement.
• Indirect method
• In this method, the value of quantity is obtained by
measuring other quantities that are functionally related to
the required value.
• The related quantities are measured directly, and
the required value is then determined using a
mathematical relationship.
• Eg: angle measurement using sine bar
• Fundamental or absolute method
• In this method, the measurement is based on the
measurement of the base quantities used to define the
quantity.
• The quantity under consideration is directly measured and is
then linked with the definition of that quantity.
• Comparative method
• The quantity to be measured is compared with the known
value of the same quantity or any other quantity practically
related to it.
• The quantity is compared with the master gauge and only
the deviations from the master gauge are recorded after
comparison.
• Eg. Dial indicators
• Transposition method
• This method involves making the measurement by direct
comparison wherein the quantity to be measured(V) is
initially balanced by a known value (X) of the same quantity.
Then, ‘X’ is replaced by the quantity to be measured and
balanced again by another known value (Y). If the quantity to
be measured is equal to both X and Y, then
• V = √(XY)
• Eg. Determination of mass by balancing methods
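The result V = √(XY) follows because the unknown arm ratio cancels: if V·a = X·b in the first balancing and Y·a = V·b after transposition, multiplying the two gives V² = XY. A sketch with an assumed arm ratio:

```python
import math

def transposition_value(x, y):
    """Transposition method: with V*a = X*b and Y*a = V*b, the unknown
    arm ratio cancels, giving V = sqrt(X * Y)."""
    return math.sqrt(x * y)

# Hypothetical unequal-arm balance with arm ratio b/a = 1.02 (assumed)
true_v = 100.0
ratio = 1.02
x = true_v / ratio   # first balancing:        X = V * (a/b)
y = true_v * ratio   # after transposition:    Y = V * (b/a)
print(transposition_value(x, y))   # ≈ 100.0 despite the biased arms
```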
• Coincidence methods
• In this method, a very minute difference between the
quantity to be measured and the reference is determined by
careful observation of certain lines and signals
• Eg: Vernier caliper
• Deflection method
• This method involves the indication of the value of the
quantity to be measured by deflection of a pointer on a
calibrated scale.
• Eg. Pressure measurement
• Null measurement method
• In this method, the difference between the value of the
quantity to be measured and the known value of the same
quantity with which comparison is to be made is brought
to zero.
• Substitution method
• This method involves the replacement of the value of the
quantity to be measured with a known value of the same
quantity, so selected that the effects produced in the
indicating device by these two values are the same.
• Contact method
• In this method, the surface to be measured is touched by the
sensor or measuring tip of the instrument.
• Eg. Micrometer, Vernier calliper and dial indicator
• Contactless method
• As the name indicates, there is no direct contact
with the surface to be measured
• Eg. Tool maker's microscope, profile projector
• Composite method
• The actual contour of a component to be checked is
compared with its maximum and minimum tolerance limits.
• Cumulative errors of the interconnected elements of the
component which are controlled through a combined
tolerance can be checked by this method.
• This method is very reliable to ensure interchangeability and
is effected through the use of composite GO gauges.
General characteristics in metrology
Sensitivity: It is the ratio of the magnitude of output signal to the
magnitude of input signal. It denotes the smallest change in the
measured variable to which the instrument responds.
Sensitivity=(Infinitesimal change of output signal)/(Infinitesimal
change of input signal)

 If the input-output relation is linear, the sensitivity will be
constant for all values of input.
 If the instrument has non-linear static characteristics, the
sensitivity of the instrument depends on the value of the input
quantity.
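Sensitivity as the ratio of an infinitesimal output change to the input change that caused it can be estimated numerically. The two instrument characteristics below are assumed examples:

```python
def sensitivity(instrument, x, dx=1e-6):
    """Sensitivity: ratio of an infinitesimal change of output signal
    to the change of input signal (central-difference estimate)."""
    return (instrument(x + dx) - instrument(x - dx)) / (2 * dx)

def linear(x):       # linear static characteristic (assumed gain 2.5)
    return 2.5 * x + 1.0

def quadratic(x):    # non-linear static characteristic (assumed)
    return 0.5 * x ** 2

print(sensitivity(linear, 4.0))      # ≈ 2.5, constant at any input
print(sensitivity(quadratic, 4.0))   # ≈ 4.0, changes with the input value
```

The linear instrument shows the same sensitivity everywhere; the non-linear one gives a different value at each operating point, matching the statement above.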
Hysteresis: All the energy put into the stressed component
when loaded is not recovered upon unloading. Hence, the output of
a measurement system will partly depend on its previous input
signals and this is called as hysteresis.
Range: It is the minimum and maximum values of a
quantity which an instrument is designed to measure. The
region between which the instrument is to operate is called the
range.
•Range = Lower Calibration Value (Lc) to Higher
Calibration Value (Hc)
•Span: It is the algebraic difference between
higher calibration value and lower calibration value.
• Span = Hc - Lc
•Ex: If the range of an instrument is 100°C to 150°C, its span is 150°C
– 100°C = 50°C
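Span is a one-line calculation; a sketch using the example values:

```python
def instrument_span(lower_cal, higher_cal):
    """Span: algebraic difference between the higher and lower
    calibration values; the range runs from lower_cal to higher_cal."""
    return higher_cal - lower_cal

# Example from the text: range 100 degC to 150 degC
print(instrument_span(100, 150))   # 50
```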
Response Time: It is the time which elapses after a sudden change in
the measured quantity, until the instrument gives an
indication differing from the true value by an amount less than a
given permissible error.
Speed of response of a measuring instrument is defined as the
quickness with which an instrument responds to a change in input
signal.
Repeatability: It is the ability of the measuring instrument to
give the same value every time the measurement of a given quantity
is repeated.
It is the closeness between successive measurements of the
same quantity with the same instrument by the same operator over a
short span of time, with same value of input under same operating
conditions.
Stability: The ability of a measuring instrument to retain its
calibration over a long period of time is called stability. It determines
an instrument's consistency over time.
Backlash: Maximum distance through which one part of an
instrument may be moved without disturbing the other part.
Accuracy: The degree of closeness of a measurement compared to
the expected value is known as accuracy.
Precision: A measure of consistency or repeatability of
measurement. i.e. successive reading does not differ. The ability of
an instrument to reproduce its readings again and again in the same
manner for a constant input signal.
Magnification: Human limitations or incapability to read
instruments places limit on sensitiveness of instruments.
Magnification of the signal from measuring instrument can make it
better readable.
Resolution: The minimum value of input signal required to cause an
appreciable change or an increment in the output is called
resolution. It is also the minimum value that can be measured when
the input is gradually increased from a non-zero value.
Error: The deviation of the measured value from the true value is
called error.
Drift: The variation of change in output for a given input over a
period of time is known as drift.
Threshold: The minimum value of input below which no output
appears is known as the threshold.
Reliability: Reliability may be explicitly defined as the probability
that a system will perform satisfactorily for at least a given period of
time when used under stated conditions. The reliability function is
this same probability expressed as a function of the time period.
Types of standards
• Line standard
• Standard yard
• Standard metre
• End standard
• End bar
• Slip gauges
• Wavelength standard
Line standard
 The measurement of distance may be made between
two parallel lines or two surfaces.
 When a length is measured as the distance
between the centres of two engraved lines, it is called a line
standard
Standard yard
• Yard is made of a one inch square cross section
bronze bar and is 38 inches long
• The bar has round recesses of 0.5 inch
diameter and 0.5 inch depth, each fitted with a gold plug. Each
plug is 1 inch away from an end
• The highly polished top surfaces of these plugs
contain three transversely and two longitudinally
engraved lines lying on the neutral axis
• The yard is the distance between the two central
transverse lines on the plugs when the temperature
of the bar is constant at 62°F
• To protect the gold plug from accidental damage, it
is kept at the neutral axis as the neutral axis
remains unaffected even if the bar bends
Standard metre
• The metre is the distance between the centre portions
of two lines engraved on the polished surface of bar
made up of platinum (90%) and iridium (10%) having a
unique cross section
• The web section gives maximum rigidity and economy
in the use of costly material
• The upper surface of the web is inoxidizable and
needs a good finish for quality measurement
• The bar is kept at 0°C and under
normal atmospheric pressure.
End standard
 The need of an end standard arises as the use of line standards and
their copies was difficult at various places in workshops
 These are usually in the form of end bars and slip gauges
• End bar
• End bars made of steel having cylindrical section of 22.2 mm diameter
with the faces lapped and hardened at the ends are available in various
lengths.
• Flat and parallel-faced end bars are the most practical
end standards used for measurement.
• These are used for measurement of larger sizes
• Slip gauges
• Slip gauges are rectangular blocks of hardened and stabilized high grade
cast steel
• The length of a slip gauge is strictly the dimension which it measures
• The blocks after being manufactured are hardened to resist wear and
are allowed to stabilize to release internal stresses
• A combination of slip gauges enables measurements to be made in
the range of 0.0025 mm to 100 mm; in combination with end bars,
a measurement range up to 1200 mm is possible.
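Selecting a slip-gauge combination is commonly done by eliminating the last decimal place first. The sketch below uses an assumed, simplified metric set (the block sizes listed are not a full standard set) and a greedy fill for the integer remainder, so it will not handle every possible dimension:

```python
def build_stack(target_mm):
    """Digit-elimination sketch for choosing slip gauges: clear the
    0.005 place, then the 0.01 place, then fill greedily from an
    assumed list of larger blocks. Illustrative only."""
    remaining = round(target_mm, 3)
    stack = []

    # clear the 0.005 place with the 1.005 mm block
    if round(remaining * 1000) % 10 == 5:
        stack.append(1.005)
        remaining = round(remaining - 1.005, 3)

    # clear the 0.01 place with a 1.01-1.49 mm block
    hundredths = round(remaining * 100) % 100
    if 0 < hundredths < 50:
        block = round(1.0 + hundredths / 100, 2)
        stack.append(block)
        remaining = round(remaining - block, 3)

    # fill the rest greedily (assumed available sizes, largest first)
    for size in [100.0, 75.0, 50.0, 40.0, 30.0, 20.0, 10.0,
                 9.5, 9.0, 8.5, 8.0, 7.5, 7.0, 6.5, 6.0, 5.5,
                 5.0, 4.5, 4.0, 3.5, 3.0, 2.5, 2.0, 1.5, 1.0, 0.5]:
        if size <= remaining + 1e-9:
            stack.append(size)
            remaining = round(remaining - size, 3)
        if remaining == 0:
            break
    return stack

print(build_stack(41.125))   # [1.005, 1.12, 30.0, 9.0]
```

For 41.125 mm the sketch reproduces the classic workshop build-up: 1.005 + 1.12 + 30.0 + 9.0.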
Wavelength standards
• Line and end standards are physical standards and
are made up of materials that can change their size
with temperature and other environmental
conditions
• In search of a suitable unit of length unaffected by such
changes, the wavelength standard was established
• Laser light is used as a primary-level wavelength
standard
• According to this standard, a metre is defined as
equal to 1650763.73 wavelengths of the red-orange
radiation of the krypton-86 isotope in gas form
• 1 metre = 1650763.73 wavelengths
• 1 yard = 0.9144 m
= 0.9144 × 1650763.73
= 1509458.35 wavelengths
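The yard conversion above can be checked in code (the wavelength count is the 1960 krypton-86 definition of the metre):

```python
# 1960 SI definition: 1 metre = 1 650 763.73 wavelengths of the
# red-orange radiation of krypton-86
WAVELENGTHS_PER_METRE = 1650763.73

def metres_to_wavelengths(length_m):
    """Express a length in wavelengths of the Kr-86 red-orange line."""
    return length_m * WAVELENGTHS_PER_METRE

print(metres_to_wavelengths(1.0))      # 1650763.73
print(metres_to_wavelengths(0.9144))   # ≈ 1509458.35 (1 yard)
```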
