
Introduction to Metrology

Metrology
• Derived from the Greek words "metron" (measurement) and "logos" (science).
• BIPM (International Bureau of Weights and Measures): "The science of measurement, embracing both experimental and theoretical determinations at any level of uncertainty in any field of science and technology."
Functions of Metrology
• Establishing units of measurement
• Reproducing these units as standards
• Ensuring the uniformity of measurement
• Development of methods of measurement
• Analysis of the accuracy of measurement methods and the errors involved.
[Diagram: 1. Units of Measurement – 2. Standards of Measurement – Uniformity of Measurement – Methods of Measurement – Accuracy]
Objectives of Metrology
• Used for selecting the proper measuring instrument.
• Used for deciding the proper measuring standards.
• Used for minimizing the cost of inspection.
• Determining process capabilities.
• Decide/find tolerances.
• Achieve standardization.
• Maintain accuracy and precision at the time
of inspection.
Types of Metrology
• Legal Metrology or Scientific Metrology
• Deterministic metrology or Industrial metrology.
Legal Metrology or Scientific
Metrology
• Scientific or fundamental metrology is concerned with the establishment of quantity systems and units of measurement, the development of new measurement methods, etc.
• Applications
– Commercial transactions related to net quantity.
– Industrial measurements: accuracy and interchangeability.
– Measurement of health.
– Measurement of human safety.
Industrial Metrology
• It is concerned with the application of measurement science to manufacturing and other processes, and with their use in society.
Inspection
• In engineering activities, inspection involves the measurements, tests, and gauges applied to certain characteristics of an object or activity.
• It is based on careful observation.
Need of Inspection
• Quality output
• Change in technology
• Mass production
• Save money
• Interchangeability
• To develop reputation
Standards of Measurement
• A standard is an exact quantity that
people agree to use for comparison.
Types of Standards
• Primary Standard
• Secondary Standard
• Tertiary Standard
• Working Standard
Primary Standard
• These are material standards preserved under the most careful conditions.
• They are not used directly for measurements; they are used only once in 10 or 20 years, for calibrating secondary standards.
• Ex: International Prototype Meter, Imperial Standard Yard.
Example: International Prototype Meter
The bars were to be made of a special alloy, 90% platinum and 10% iridium, which is significantly harder than pure platinum, and have a special X-shaped cross section (a "Tresca section", named after French engineer Henri Tresca) to minimise the effects of torsional strain during length comparisons.
International Prototype Weight
• The IPK is made of a platinum alloy known as “Pt-10Ir”, which
is 90% platinum and 10% iridium (by mass) and is machined
into a right-circular cylinder (height = diameter) of 39.17
millimeters to minimize its surface area.
Secondary Standard
• The value of a secondary standard is less accurate than that of the primary standard. It is obtained by comparison with the primary standard.
• These are close copies of primary standards with respect to design, material, and length.
Tertiary Standard
• Maintained at National Physical Laboratories (NPL).
• The primary or secondary standards exist as the ultimate controls, for reference only at rare intervals.
• Tertiary standards are made as close copies of secondary standards and are kept as references for comparison with working standards.
Working Standards

• These standards are similar in design to primary, secondary, and tertiary standards.
• However, being lower in cost and made of lower-grade materials, they are used for general applications in metrology laboratories.
Line Standard
• The standard in which the distance is
measured between two straight and parallel
lines.
• Example :- Steel Rule
• Advantages
– Quick and easy
– Cheaper
• Disadvantages
– Parallax error can occur
– Lower accuracy
End Standard
• When the distance is measured between two flat and parallel surfaces, it is called an end standard.
• Example: Vernier Caliper, Micrometer
• Advantages
– An accuracy of 0.005 mm is measurable
– Minimum possibility of parallax error
• Disadvantages
– They are costly and difficult to use
– The measuring surfaces are to be protected.
Wavelength Standard
• Material standards are subject to errors due to continuous use; because of this, wavelength standards are used.
• Light sources such as Cadmium-114 and Krypton-86 in a hot-cathode discharge lamp at 68 K.
• According to this:
– 1 m = 1,650,763.73 × wavelength of Kr-86
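As a quick arithmetic check of this definition, dividing one metre by the stated number of wavelengths gives the Kr-86 wavelength itself. The short Python sketch below is illustrative only.

```python
# Quick check of the Kr-86 metre definition: 1 m = 1,650,763.73 wavelengths,
# so one wavelength is the reciprocal of that count.
wavelengths_per_metre = 1_650_763.73

wavelength_m = 1.0 / wavelengths_per_metre   # wavelength in metres
wavelength_nm = wavelength_m * 1e9           # convert to nanometres

print(f"Kr-86 wavelength = {wavelength_nm:.2f} nm")   # about 605.78 nm
```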
Krypton-86 Wavelength Experiment
Difference between line, end and wavelength standards

| Sr. No | Parameter | Line Standard | End Standard | Wavelength Standard |
| 1 | Accuracy | Limited, ±0.2 mm | Good, ±0.001 mm | Highest |
| 2 | Time of measurement | Quick | Slightly more than line standard | Time consuming |
| 3 | Effect of use | Difficult to assume zero | Ends are hardened | Depends on wavelength |
| 4 | Errors | Parallax | Wringing errors, environmental error, wear and tear | No error |
| 5 | Process | Easy | Slightly difficult | Highly difficult |
| 6 | Cost | Low | Medium | High |
| 7 | Example | Steel rule | Vernier, slip gauges | Monochromatic light source |
| 8 | Care | Engraved lines must be protected | Measuring ends must be protected | Not affected |
Slip Gauges
Slip gauges are rectangular blocks of steel having a cross-section of about 30 × 10 mm.
Selecting Slips
• The minimum number of slip gauges should be chosen for a combination, depending on the type of set available.
Standard Sets

Set of M45
| Range (mm) | Step (mm) | Pieces |
| 1.001 to 1.009 | 0.001 | 9 |
| 1.01 to 1.09 | 0.01 | 9 |
| 1.1 to 1.9 | 0.1 | 9 |
| 1 to 9 | 1 | 9 |
| 10 to 90 | 10 | 9 |
| Total | | 45 |

Set of M87
| Range (mm) | Step (mm) | Pieces |
| 1.001 to 1.009 | 0.001 | 9 |
| 1.01 to 1.49 | 0.01 | 49 |
| 0.5 to 9.5 | 0.5 | 19 |
| 10 to 90 | 10 | 9 |
| 1.005 | – | 1 |
| Total | | 87 |
Steps
• Example: Building the dimension 56.421 mm
1. Consider the last decimal place first, i.e. 0.001 mm.
2. Since a 0.001 mm gauge is not available, select the 1.001 mm slip gauge.
3. 56.421 − 1.001 = 55.420 mm now remains to be built up.
4. Consider the second decimal place, i.e. 0.02 mm.
5. Select the 1.02 mm slip gauge.
6. Therefore 55.420 − 1.02 = 54.400 mm remains.
7. We have slip gauges of 1.4 mm and 3.0 mm; therefore,
8. 54.400 − 1.4 − 3.0 = 50.000 mm remains to be built up.
9. Finally, choose the 50 mm slip gauge.
10. Thus, we have 50.000 + 3.000 + 1.400 + 1.020 + 1.001 = 56.421 mm.
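The worked example above can be expressed as a small greedy routine: clear the thousandths, hundredths and tenths with one gauge each, then cover the remaining whole millimetres. The sketch below is illustrative only; the names `m45_set` and `build_stack` are not standard, an M45-style set is assumed, and the routine follows the "last decimal first" procedure rather than handling every possible target.

```python
# A minimal sketch of the "last decimal first" slip-gauge selection procedure
# described above, using the M45 set ranges.
def m45_set():
    """Return the nominal sizes (in mm) of an M45 slip gauge set."""
    sizes = []
    sizes += [round(1 + 0.001 * i, 3) for i in range(1, 10)]   # 1.001 .. 1.009
    sizes += [round(1 + 0.01 * i, 2) for i in range(1, 10)]    # 1.01 .. 1.09
    sizes += [round(1 + 0.1 * i, 1) for i in range(1, 10)]     # 1.1 .. 1.9
    sizes += list(range(1, 10))                                # 1 .. 9
    sizes += list(range(10, 100, 10))                          # 10 .. 90
    return sizes

def build_stack(target_mm):
    """Greedily eliminate the smallest decimal place first, as in the worked example."""
    available = sorted(m45_set(), reverse=True)
    remaining = round(target_mm, 3)
    stack = []
    # Clear thousandths, then hundredths, then tenths with one gauge each.
    for step in (0.001, 0.01, 0.1):
        digit = round(remaining / step) % 10
        if digit:
            gauge = round(1 + digit * step, 3)   # e.g. 1.001, 1.02, 1.4
            stack.append(gauge)
            remaining = round(remaining - gauge, 3)
    # Cover the remaining whole millimetres with the largest gauges available.
    for gauge in available:
        while gauge == int(gauge) and remaining >= gauge > 0:
            stack.append(float(gauge))
            remaining = round(remaining - gauge, 3)
    return stack

print(build_stack(56.421))   # [1.001, 1.02, 1.4, 50.0, 3.0], summing to 56.421 mm
```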
Accessories of slip gauges
• Various slip gauge accessories are shown in the following figure and listed below:
1. Measuring jaws
2. Scribing and center point
3. Holders and base
• A holder can be used with a combination of slip gauges for measuring overall length and internal and external dimensions. The holder provides a proper grip on the slips.
• After selection, the slips are placed in the holder, gripping pressure is applied, and measurement can be started.
Care of slip gauges
1. All surfaces should be protected by applying anti-corrosive jelly.
2. They should be protected from dust and dirt.
3. When not in use, they should be kept in their proper places.
4. Gauges should be used in air-conditioned rooms, which protects them from thermal expansion.
5. Gauges should be demagnetized to avoid attracting metal dust.
6. They should be handled as little as possible.
7. Handling should be done using hand gloves.
8. They should never be kept directly on working surfaces.
9. After use, finger marks should be removed and the wiped gauges placed in their box.
10. Slip gauges should not be kept wrung together for an unnecessary length of time.
Static & Dynamic Characteristics of Measuring Instruments
• Precision
• Accuracy
• Sensitivity
• Hysteresis
• Speed of Response (Response Time)
• Repeatability
• Calibration
• Amplification
• Magnification
• Reproducibility
• Drift
• Resolution
• Interchangeability
Accuracy & Precision (target diagrams)
• Case 1 – Low Accuracy, Low Precision
• Case 2 – Low Accuracy, High Precision
• Case 3 – High Accuracy, Low Precision
• Case 4 – High Accuracy, High Precision


Accuracy
• Closeness of the measured value to the true value
• Can be determined from a single reading
• For example, if in the lab you obtain a weight
measurement of 3.2 kg for a given substance, but
the actual or known weight is 10 kg, then your
measurement is not accurate. In this case, your
measurement is not close to the known value.
• E.g. a job having a true dimension of 25 mm and the instrument showing a reading of 24.98 mm.
Precision
• Defined as the repeatability of a measuring instrument, i.e. how close the measured values are to each other.
• Cannot be determined from a single reading; a set of readings is required to describe precision.
• E.g. readings obtained from a measuring instrument (true value 25 mm):
– Set 1: 24.70, 25.31, 24.69, 24.89, 25.02
– Set 2: 24.98, 25.02, 25.01, 25.00, 25.00
– Comparison: Set 2 is more precise than Set 1.
Sensitivity
• The ratio of the change in output of an instrument to the smallest change in input.
• Example:
– If the sensitivity of a voltmeter is, say, 1 mV, then if you apply a potential difference of 1 mV the display moves; if you apply less than 1 mV, the display does not move.
Linearity
• Linearity is the proportional relationship between the input and output of a measuring instrument.
Hysteresis
• It is the difference between the indications of a measuring instrument when the same value of the measured quantity is reached by increasing or by decreasing that quantity.

| Input Weight | Upscale Reading | Downscale Reading | Ideal |
| 1 kg | 1.00 kg | 0.92 kg | 1 kg |
| 2 kg | 2.03 kg | 1.99 kg | 2 kg |
| 3 kg | 3.10 kg | 2.89 kg | 3 kg |
| 4 kg | 4.05 kg | 3.99 kg | 4 kg |
| 5 kg | 5.03 kg | 5.02 kg | 5 kg |

• E.g. hysteresis at 4 kg = 4.05 − 3.99 = 0.06 kg
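The hysteresis at each input follows directly from the table: pair the upscale and downscale readings taken at the same input weight and take the difference. The sketch below uses the table's values; the dictionary layout is just one convenient representation, not anything prescribed by the slides.

```python
# Hysteresis per input: |upscale reading - downscale reading| at the same weight.
upscale   = {1: 1.00, 2: 2.03, 3: 3.10, 4: 4.05, 5: 5.03}   # readings (kg) while loading
downscale = {1: 0.92, 2: 1.99, 3: 2.89, 4: 3.99, 5: 5.02}   # readings (kg) while unloading

hysteresis = {w: round(abs(upscale[w] - downscale[w]), 2) for w in upscale}
print(hysteresis)                                # {1: 0.08, 2: 0.04, 3: 0.21, 4: 0.06, 5: 0.01}
print("max hysteresis:", max(hysteresis.values()), "kg")   # 0.21 kg, at the 3 kg input
```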


Repeatability
• Repeatability or test–retest reliability is the variation in measurements taken by a single person or instrument on the same item, under the same conditions, and within a short period of time.
• A measurement may be said to be repeatable when this variation is smaller than a pre-determined acceptance criterion.
Reproducibility
• It is the quantitative measure of the closeness of the agreement between the results of measurements of the same measurand, carried out by changing the method of measurement, observer, measuring instrument, location, conditions of use, time, etc.
Calibration
• The process through which the reliability of an instrument is maintained by comparing the instrument with known standards.
• Need
– To take into account the error-producing properties of each component.
– To take into account environmental errors.
– To understand/detect errors generated by frequent use of the equipment.
– To assure the quality of inspection during the manufacturing process.
Calibration Example
Magnification
• Increasing the magnitude of the output by using a mechanical, optical, or pneumatic arrangement so that the instrument becomes more readable.
Amplification
• The electronic method of magnification, ideally suited for processing of signals, is called amplification.
Drift
• Drift concerns an instrument's ability to maintain its calibration over a period of time.
• If an instrument does not reproduce the same
reading at different times of measurement for
the same input signal, it is said to have drift. If
an instrument has perfect reproducibility, it is
said to have no drift.
Range
• The region between the limits within which an instrument is designed to operate for measuring its input quantity is called the range of the instrument.
• Range is expressed by stating the upper and lower values.
• E.g. thermometer range: −100 °C to 100 °C
Span
• The algebraic difference between the upper and lower range values of the instrument.
• E.g. span of a thermometer having range −100 °C to 100 °C:
• Span = Upper Value − Lower Value
• Span = 100 − (−100) = 200 °C
Dead Time
• It is the time lag between an input given to an instrument and the resulting output.
Dead Zone
• The largest change in the input quantity for which there is no output.
• E.g. the input applied to an instrument may not be sufficient to overcome friction; the instrument responds only once the input overcomes the friction.
Resolution
The ability of the instrument to read the smallest increment of the measured dimension.
Speed of Response
• The speed of response of a measuring instrument is defined as the quickness with which the instrument responds to a change in the input signal.
Overshoot
Interchangeability
Errors
• Difference between measured value and true
value.
• Types of Error
– Systematic Error
– Random Error
Systematic Error
• Systematic errors in experimental observations usually come from the measuring instruments. They are controllable in nature.
• They may occur because:
– there is something wrong with the instrument or its data-handling system, or
– the instrument is wrongly used by the experimenter.
Example of Systematic Error: Parallax Error
Random Error
• These errors occur randomly; hence they cannot be eliminated, but their magnitude can be minimized.
• Example:
– Positioning of the standard or workpiece, slight displacement of the jaws, fluctuation of the instrument, operator error, etc.
Sources of Errors
• Defect in the instrument
• Adjustment of instrument
• Imperfection of instrument design.
• Method of location of instrument
• Environmental effects
• Error because of properties of work piece
• Error due to surface finish of object
• Error due to change in size of object
Environmental Error
• A 25 mm steel length will increase by about 0.3 microns for a 1 °C change in temperature.
• Standard temperature and humidity: 20 °C at 35 to 45 % RH.
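The 0.3-micron figure follows from the linear expansion relation ΔL = α·L·ΔT. The sketch below assumes a typical coefficient of expansion for steel of about 11.5 × 10⁻⁶ per °C (an assumed textbook value, not given in the slides).

```python
# Thermal expansion error: delta_L = alpha * L * delta_T.
# alpha for steel is taken as ~11.5e-6 per degree C (assumed typical value).
def thermal_error_um(length_mm, delta_t_c, alpha_per_c=11.5e-6):
    """Return the change in length in micrometres for a temperature change in degrees C."""
    delta_l_mm = alpha_per_c * length_mm * delta_t_c
    return delta_l_mm * 1000.0          # mm -> micrometres

print(thermal_error_um(25, 1))          # about 0.29 um, matching the ~0.3 micron quoted above
```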
Dirt Error
• Dirt particles can enter the inspection room through doors, windows, etc. These particles can introduce small changes or errors at the time of measurement. For this reason, metrology laboratories must be housed in dust-proof rooms.
Support Error

Supports too close

Supports too long

Proper support
Contact Pressure Error
Sine Error

sin θ = e / (d/2), therefore e = (d/2)·sin θ

where θ is the small angle by which the line of measurement gets inclined, and e is the error at one side.
Cosine Error

∴ Error = l − L = l − l·cos θ = l(1 − cos θ)
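A small numerical check of the two formulas, with illustrative values for the contact diameter d, the length l, and the tilt angle θ (none of these values come from the slides):

```python
import math

def sine_error(d_mm, theta_deg):
    """e = (d/2) * sin(theta): the offset error at one side of the contact."""
    return (d_mm / 2.0) * math.sin(math.radians(theta_deg))

def cosine_error(l_mm, theta_deg):
    """Error = l * (1 - cos(theta)) when the measuring axis is tilted by theta."""
    return l_mm * (1.0 - math.cos(math.radians(theta_deg)))

print(round(sine_error(10, 1), 4))      # (10/2)*sin(1 deg)    = 0.0873 mm
print(round(cosine_error(25, 1), 4))    # 25*(1 - cos(1 deg))  = 0.0038 mm
```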
Contact Error
Parallax Error
Classification of Methods of
Measurement
• Direct measurement
• Indirect measurement
• Fundamental measurement
• Comparison measurement
• Null method measurement
• Deflection measurement
• Contact measurement
• Contactless measurement
Direct measurement
Indirect measurement
Fundamental measurement
Comparison measurement
Null method measurement
Deflection measurement
Contact measurement
Contactless measurement
Classification of Measuring Instruments
Linear Measurement
• Linear measurement means measurement between two points or planes. It is basically related to the distance between them, using a line or end standard.
• Non-precision instruments:
– Steel rule
– Slip gauge
– Caliper and scale
– Feeler gauge, etc.
• Precision instruments:
– Vernier caliper
– Vernier height gauge
– Vernier depth gauge
– Micrometer
– Inside micrometer
– Depth micrometer
• Supporting devices
• Equipment for linear measurement:
– Steel rule
– Engineering square
– Radius gauges
– Feeler gauge
– Pitch gauge
– Calipers
– Telescopic gauge
Telescopic gauge: the gauge is removed from the measured cavity and the dimension is measured with the aid of a micrometer or caliper.
Combination Set
This instrument is most commonly used for layout and inspection purposes. Spirit levels are mounted on the square head and the protractor head for testing surfaces for parallelism and for checking inclination.
Applications of Combination set
Straight Edge
[Figure: straight edge, cross-section A–A]
Universal Surface Gauge
Verniers
• When two scales of slightly different sizes are used, the difference between them can be utilized to increase the accuracy of measurement.
• Types of Verniers
– Simple Vernier
– Vernier Height Gauge
– Vernier Depth Gauge
– Micrometer
– Depth Micrometer
– Inside Micrometer
Simple Vernier
• By using a vernier, the following measurements can be made:
– Outside dimensions (outside jaws in use)
– Inside dimensions (inside jaws in use)
– Depth (using the depth bar)
– Step measurement (using the step surface)
Vernier
Calculating Least Count
• L.C. = (Smallest division on main scale) / (Number of divisions on vernier scale)
• If the value of the smallest division on the main scale is 1 mm and there are 50 divisions on the vernier scale, then the least count will be:
• L.C. = 1/50 = 0.02 mm
Reading Vernier
• Total Reading (TR) = Main Scale Reading (MSR) + [Vernier Scale Division (VSD) × Least Count (LC)]
• i.e. TR = MSR + VSD × LC
• TR = 18 + (23 × 0.02) = 18 + 0.46 = 18.46
• TR = 18.46 mm
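The least-count and total-reading formulas above can be combined in a few lines; the function names below are illustrative only.

```python
# Least count and total reading for a vernier, following the formulas above.
def least_count(main_scale_div_mm, vernier_divisions):
    """L.C. = smallest main-scale division / number of vernier divisions."""
    return main_scale_div_mm / vernier_divisions

def total_reading(msr_mm, vsd, lc_mm):
    """TR = MSR + VSD * LC."""
    return msr_mm + vsd * lc_mm

lc = least_count(1.0, 50)               # 0.02 mm, as in the example
print(total_reading(18, 23, lc))        # 18 + 23*0.02 = 18.46 mm
```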
Vernier Height gauge
Vernier Depth Gauge

Micrometer

• Parts of Micrometer
– Frame
– Anvil and spindle
– Screwed spindle
– Graduated sleeve or barrel
– Thimble
– Ratchet or friction stop
– Spindle clamp
Micrometer
• Least count of the micrometer = 0.01 mm (i.e. 0.5/50: the smallest division on the main scale (barrel) is 0.5 mm and the thimble has 50 divisions).
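A corresponding sketch for the micrometer, assuming the 0.5 mm barrel division and 50-division thimble stated above; the example reading values are illustrative only.

```python
# Micrometer reading = barrel (main scale) value + thimble divisions * least count,
# with least count 0.5/50 = 0.01 mm as stated above.
def micrometer_reading(barrel_mm, thimble_divisions, least_count_mm=0.01):
    """Return the total micrometer reading in mm."""
    return barrel_mm + thimble_divisions * least_count_mm

print(micrometer_reading(7.5, 22))      # 7.5 + 22*0.01 = 7.72 mm (illustrative values)
```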
Micrometer Reading
Depth Micrometer
Inside Micrometer
Supporting Devices
Surface Plate
Adjustable Stand
Angle Plate
V - Block
Spirit Level
