
Engineering Metrology
Fair Use Notice

The material used in this presentation (pictures, graphs, text, etc.) is solely
intended for educational/teaching purposes. It is offered free of cost to students
under the special circumstances of online education during the COVID-19 lockdown
and may include copyrighted material, the use of which may not have been
specifically authorized by the copyright owners. Its use constitutes fair use of
such copyrighted material as provided in the globally accepted laws of many
countries. The contents of the presentations are intended only for the attendees
of the class being conducted by the presenter.
Metrology
Metrology is the science of measurement; it includes all theoretical and
practical aspects of measurement.

Measurement is the process of comparing the unknown magnitude of a certain
parameter with a known, predefined standard of that parameter.

A dimension is a numerical value expressed in appropriate units of measure and
used to define the size, location, orientation, form or other geometric
characteristics of a part.
Standard
A standard is an exact quantity that people agree
to use for comparison.
Example: The measuring unit of length is the meter.
Originally, in 1889, the meter was defined as the distance between two lines on a
specific platinum-iridium bar (the standard) representing the length of one meter.
In 1960 the meter was redefined as 1,650,763.73 wavelengths of a particular
orange light emitted by the gas krypton-86 (a rare gas).

Measuring Temperature
• Temperature, dirt, humidity and vibration are the four major
environmental factors that influence measurement.
• For making fine measurements with precision instruments, temperature
control is very important.
• The standard measuring temperature is 20 °C, and all instruments are
calibrated at this temperature.
Metrology- Significance
1. Increasing our knowledge and understanding of the world
2. Converting physical parameters into meaningful numbers
3. To determine the true dimensions of a part i.e.
• To ensure the material, parts and components conform to the established standards
• Take the decision to perform rework on defective parts, that is, to assess the
possibility of making some of these parts acceptable after minor repairs.
• Pre-Process & In-process inspection to produce the parts having acceptable
quality levels
• Post-process inspection to verify dimensional tolerance
4. Establishing uncertainty of measurement & Investigating the causes of
measurement errors and subsequently eliminating them.
5. Helps in
I. Evaluating the performance & response of a system.
II. Testing if the elements that constitute the system function as per the
design.
III. To establish the validity of design
IV. Ensuring public health and safety.
6. To ensure interchangeability with a view to promoting mass production.
Engineering/Dimensional Metrology
Engineering/Dimensional metrology is the branch of metrology which deals with the
measurement of the dimensions of a part or workpiece (such as length, thickness,
diameter, taper, angle, flatness, profile, etc.).
These are the quantities and geometric features most commonly measured in
engineering practice and in products made by manufacturing processes.
Engineering/Dimensional Metrology- Terminologies
Accuracy:
• Accuracy is the degree of closeness/conformity of a measured quantity to
its actual value.
Precision
• Precision is the degree of closeness of repeated measurements under
same conditions.
– Precision expresses the degree of reproducibility
and repeatability or agreement between repeated
measurements.

Error:
The difference between the true value and the measured value is known as
measurement error.
Error = Vt − Vm (where Vt is the true value and Vm is the measured value)
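As a minimal numerical sketch of these terms (the readings below are hypothetical, not from any real instrument), the error of a reading and the scatter of repeated readings can be computed as follows; a small error indicates good accuracy, a small spread indicates good precision:

```python
# Hypothetical readings (mm) of a gauge block whose true length Vt is 25.000 mm.
true_value = 25.000
readings = [25.012, 25.011, 25.013, 25.012, 25.010]

mean_reading = sum(readings) / len(readings)
error = true_value - mean_reading          # Error = Vt - Vm (relates to accuracy)
spread = max(readings) - min(readings)     # scatter of repeats (relates to precision)

print(f"mean reading      = {mean_reading:.3f} mm")
print(f"error (Vt - Vm)   = {error:+.3f} mm")
print(f"spread of repeats = {spread:.3f} mm")
```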
Terminology associated with metrology
Repeatability
• Variation that occurs when repeated measurements are
made of the same item under absolutely identical conditions
over a short time period,
Same measurement process – Same operator – Same instruments – Same environmental conditions

Reproducibility
• Variation that results when different conditions are used to
make the measurements over longer time periods,

Same measurement process – Different operators – Different instruments – Different environmental conditions
Errors in measurement
• Measurement is the base of scientific study
• All measurements are, however, approximate values (not true values) within
the limitations of the measuring device, the measuring environment, the process
of measurement and human error.
• We seek to minimize uncertainty and hence error to the
extent possible.

Errors are broadly classified as:


1. Systematic error
– Unidirectional (Either higher or lower) & Consistent
2. Random error
– Bi-directional (May be higher or lower both) & Unpredictable
3. Least count error
4. Gross error / Mistakes / Blunders
Errors in measurement
1. Systematic error
• Unidirectional, either the measurement is larger or smaller than
true value.
• Consistent, If you repeat the experiment, you’ll get the same error.
Cause 1a- Faulty instrument/Zero error
– Faulty/worn out instrument. For example, a plastic tape measure
becomes slightly stretched over the years, resulting in
measurements that are slightly too high
– A zero error results when the zero mark of the scale does not coincide with
the pointer.
How to Minimize
– Replacement of the instrument
– Calibration of the instrument
– Change in the design of the instrument
Errors in measurement
1. Systematic error
Cause 1b- Procedural error/faulty measuring process
– Lack of understanding of the measurement process
• Inappropriate physical environment
– e.g. Measuring the mass of a small quantity of gold on a weighing scale
with a ceiling fan blowing air; the moving air may cause the scale to read
incorrectly.
• Inappropriate physical procedure
– e.g. While taking the temperature of the human body, it is important to
know which part of the body is most representative of body temperature.
How to Minimize
Periodically assessing & conducting an audit of the measuring
process in the light of new facts and advancements.
Errors in measurement
1. Systematic error
Cause 1c- Personal Error/ Parallax Error
– A personal error is introduced by human
habits, which are not conducive for accurate
measurement.
• Parallax is a displacement or
difference in the apparent position
of an object viewed along two
different lines of sight, and is
measured by the angle or semi-
angle of inclination between those
two lines
e.g. Consider the reading habit of a person. He or she may have the habit of
reading scales from an inappropriate distance and from an oblique direction.
The measurement therefore includes an error on account of parallax.
How to Minimize ???
Proper training regarding the use of measuring instrument
2. Random error
• Bi-directional i.e. Some of the measured values are greater than
true value; some are less than true value.
• Unpredictable

Causes
– Sudden change in experimental conditions
– e.g. Sudden change in temperature, Wind Speed, humidity,
fluctuation in potential difference (voltage).

How to Minimize ???
– It is an accidental error and is beyond the control of the person making
the measurement.
– It is possible to minimize this type of error by repeating measurements
and applying statistical techniques to obtain a value closer to the true value
• e.g. Using an average measurement from a set of measurements,
or Increasing sample size.
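A minimal sketch (with made-up readings) of how averaging repeated measurements reduces the influence of random error: the scatter of individual readings stays the same, but the uncertainty of the mean shrinks as the number of readings grows.

```python
import statistics

# Hypothetical repeated readings (mm) of the same dimension; each reading carries
# a random error that is equally likely to be high or low.
readings = [10.02, 9.98, 10.01, 9.99, 10.03, 9.97, 10.00, 10.01]

mean = statistics.mean(readings)              # best estimate of the true value
std_dev = statistics.stdev(readings)          # scatter of individual readings
std_error = std_dev / len(readings) ** 0.5    # uncertainty of the mean falls as 1/sqrt(n)

print(f"mean = {mean:.4f} mm")
print(f"standard deviation of readings = {std_dev:.4f} mm")
print(f"standard error of the mean     = {std_error:.4f} mm")
```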
3. Least count error
The least count of a device is equal to the smallest division on the
scale.
Consider the steel rule that we used in drawing.
What is its least count?
0.5 mm or 1 millimeter (mm) ?
1/32 inch or 1/64 inch ??
How to Minimize ???
Using an instrument with smaller least count

4. Gross error
• Typical mistakes include reading/noting the wrong
numbers from a measuring instrument.
• Mistakes are sometimes called gross errors or blunders.
How to Minimize ???
Type of measurement and instrument used
Linear measuring instruments
Linear measurement: the distance between two given points or objects, such as
lengths, diameters, heights and thicknesses.

Measuring instruments are broadly grouped as mechanical, electrical and
electronic instruments. They may be further classified in several ways:

A- Direct measuring instruments, e.g. rule/ruler, measuring tapes
B- Indirect measuring instruments, e.g. outside/inside calipers, telescopic gauges

A- Line standards: length is measured as the distance between the centres of two
engraved lines, e.g. rule/ruler, measuring tapes
B- End standards: length is expressed as the distance between two flat parallel
faces, e.g. micrometers, vernier calipers

A- Graduated instruments, e.g. vernier calipers, micrometers
B- Non-graduated instruments, e.g. outside/inside calipers, wire gauges, radius
gauges, thickness gauges, etc.

A- Non-precision type instruments, e.g. rule/ruler, measuring tapes,
outside/inside calipers, dividers, telescopic gauges, etc.
B- Precision type instruments, e.g. vernier calipers, micrometers, slip gauges, etc.
Linear measuring instruments
Electrical Instruments (Mainly Transducers)
• Basic principle – transducers: a device that converts variations in a physical
quantity, such as pressure or brightness, into a measurable electrical
parameter/signal (voltage or current), or vice versa.

EXAMPLE: A strain gauge is a sensor whose resistance varies with applied force; it
converts force, pressure, tension, weight, etc., into a change in electrical
resistance which can then be measured.

EXAMPLE: A Linear Variable Differential Transformer (LVDT) is an electromechanical
transducer that converts the rectilinear motion of an object, to which it is
coupled mechanically, into a corresponding electrical signal.
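As a rough sketch of this transducer principle (the sensitivity figure used below is purely illustrative, not the specification of any particular device), converting the electrical output back into the measured displacement amounts to a simple scaling:

```python
# Illustrative LVDT-style conversion: core displacement is proportional to the
# output voltage. The sensitivity below is an assumed calibration factor.
SENSITIVITY_V_PER_MM = 2.0   # assumed: 2 V of output per mm of core travel

def displacement_from_voltage(v_out: float) -> float:
    """Convert LVDT output voltage (V) into core displacement (mm).
    The sign of the voltage indicates the direction of travel from the null position."""
    return v_out / SENSITIVITY_V_PER_MM

print(displacement_from_voltage(0.25))   # 0.125 mm from the null position
print(displacement_from_voltage(-1.0))   # 0.5 mm in the opposite direction
```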
Linear measuring instruments
• Non-precision instruments
– Steel Rule, Bar or Tape (Direct reading)

Rule: the measured linear distance represents the actual linear measurement.
Scale: the measured linear distance represents another, imaginary (smaller or
larger) linear measurement, e.g. isometric scale, architect's scale.
• Precision instruments
– Vernier Calipers (Direct reading)
– Micrometers (Direct reading)
– Height gauges (Direct reading)
– Slip gauges (Direct reading)
– Calipers & Divider (Indirect reading)
– Diffraction Gratings
Measuring Instruments
Presentation Sub-Topics
• Introduction
• Construction
• Working Principle
• Applications
• Error Types
• Key Characteristics
(Relevant may be included)
Precision
Accuracy
Calibration
Drift/Stability
Linearity
Magnification/Amplification
Resolution
Sensitivity
The speed of response
• Advantages &/or Limitations
Size and type of part to be measured
Environmental Conditions
The operator skills required
Cost
NON-PRECISION MEASURING INSTRUMENTS
• Non-precision instruments are limited to the measurement of parts to a visible line
graduation on the instrument used. They are used where high measurement accuracy is
not required
Steel Rule
• It is the simplest and most common measuring instrument used in inspection.
• The principle behind the steel rule is that of comparing an unknown length to
one previously calibrated.
• The rule must be graduated uniformly throughout its length.
• The degree of accuracy of measurements made with a steel rule depends upon the
quality of the rule and the skill of the user in estimating part of a millimeter.
• There are rules that have attachments and special features to make their use
more versatile.
Calipers
• Calipers are used for measurement of parts that cannot be measured directly
with a scale; they are accessories to scales.
• A caliper consists of two legs hinged at the top, and the ends of the legs span
the part to be inspected.
• This span is maintained and transferred to the rule/scale.
• Calipers are of two types: spring type and firm-joint type.
Spring Type
The two legs are attached to a spring in this type of caliper.
The working ends of each leg of a spring caliper should be identical in shape and
have contact points equally distant from the fulcrum.
The calipers are adjusted to set dimensions by means of either a knurled solid
nut or a knurled quick-action release nut operating on a finely threaded
adjusting screw.
The top portion of the legs is located in a flanged fulcrum roller and held in
position by a spring in order to maintain the alignment of the working ends.
The spring provides sufficient tension to hold the legs rigid at all points of
the adjustment.
A separate washer under the nut minimizes friction between the adjusting nut and
the leg.
Diffraction Grating
Angle measuring instruments
Bevel protractor: This is a direct-reading instrument similar to a common
protractor, except that it has a movable element. The two blades of the
protractor are placed in contact with the part being measured, and the angle is
read directly on the vernier scale.

Sine bar: Measuring with this method involves placing the part on an inclined bar
(sine bar) or plate and adjusting the angle by placing gage blocks on a surface
plate. After the part is placed on the sine bar, a dial indicator is used to scan
the top surface of the part. Gage blocks are added or removed as necessary until
the top surface is parallel to the surface plate. The angle on the part is then
calculated from trigonometric relationships.
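The trigonometric relationship referred to above is simply sin(θ) = H/L, where H is the height of the gage-block stack and L is the centre distance between the sine-bar rollers. A minimal sketch with assumed values:

```python
import math

# Sine-bar relationship: sin(theta) = H / L.
SINE_BAR_LENGTH = 200.0     # mm, roller centre distance (assumed example bar)
gage_block_height = 34.20   # mm, gage-block stack height (assumed example value)

angle_deg = math.degrees(math.asin(gage_block_height / SINE_BAR_LENGTH))
print(f"angle of the part = {angle_deg:.3f} degrees")   # about 9.85 degrees here
```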
Comparative length measuring instruments
Comparative Length Measurement. Instruments used for measuring
comparative lengths (also called deviation-type instruments) amplify and
measure variations or deviations in the distance between two or more surfaces.
The most common example is a dial indicator: an instrument used to accurately
measure small distances and amplify them so that variations too small for the
naked eye to recognize become obvious.
They are all simple mechanical devices that convert linear displacements of a
pointer into the rotation of an indicator on a circular dial. The indicator is
set to zero at a certain reference surface, and the instrument or the surface to
be measured (either external or internal) is brought into contact with the
pointer. The movement of the indicator is read directly on the circular dial (as
either plus or minus some number) to accuracies as high as 1 µm. Dial indicators
with electrical and fluidic amplification mechanisms and with a digital readout
are also available.
Measuring straightness
Measuring flatness
Measuring roundness
Measuring profile
Co-ordinate measuring machines (CMM)
INTRODUCTION
Coordinate metrology is the carrying out of measurements and three-dimensional
geometric imaging of objects with the use of coordinate measuring systems.

The origins of coordinate metrology may be found in the works of the French
mathematicians Pierre de Fermat (1601–1665) and Rene Descartes (1596–1650).
Co-ordinate measuring machines (CMM)
INTRODUCTION
Coordinate measurement systems: a mechanical system moves a measuring probe to
determine the coordinates of points on the surface of a workpiece.

The primary function of a CMM is to measure the actual shape of a workpiece,
compare it against the desired shape, and evaluate metrological information such
as size, form, location, and orientation.

Significance of Coordinate Measurement Systems
✔ The scope of application of these systems is constantly growing.
✔ Demand for specialists in the field of coordinate metrology is also growing.
✔ Their number is used as a measure of the level of technological development of
a plant, region or country.
Co-ordinate measuring machines (CMM)
INTRODUCTION
First Coordinate Measuring Machines
✔ CMMs are relatively recent developments in measurement technology.
✔ The first coordinate measuring machines (CMMs) were developed by:
• Ferranti, Scotland (1956)
• The American Moore Tool Company (1957)*
• Franco Sartorio, an Italian engineer (1962)
*Image of Moore M3 CMM
✔ The development of computer technology in the 1960s led to the construction of
the first NC coordinate measuring machine in 1973.
Co-ordinate measuring machines (CMM)
BASIC OPERATION MECHANISM
∙ Basically, CMMs consist of a platform on which the workpiece being measured is
placed and moved linearly or rotated.
∙ A probe attached to a head capable of lateral and vertical movements records
all measurements.
∙ On contact, the coordinate positions are recorded by the CMM controller,
adjusting for overtravel and probe size.
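A minimal sketch of the probe-size correction mentioned in the last point, assuming the approach direction is known; the tip radius and coordinates below are illustrative, and real CMM controllers use the full calibrated stylus geometry and also account for overtravel:

```python
import math

# The CMM records the coordinate of the stylus-ball centre, so the surface point
# lies one tip radius away along the approach direction.
PROBE_TIP_RADIUS = 1.0   # mm, calibrated stylus-ball radius (assumed)

def compensate(ball_center, approach_dir):
    """Shift the recorded ball-centre coordinate by one tip radius along the
    (normalized) approach direction to estimate the surface contact point."""
    norm = math.sqrt(sum(c * c for c in approach_dir))
    unit = [c / norm for c in approach_dir]
    return [c + PROBE_TIP_RADIUS * u for c, u in zip(ball_center, unit)]

# Ball centre recorded at trigger while probing downwards along -Z:
print(compensate([120.000, 45.500, 31.000], [0.0, 0.0, -1.0]))
# -> [120.0, 45.5, 30.0]  (surface point is one radius below the ball centre)
```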
Co-ordinate measuring machines (CMM)
CONSTRUCTION/HARDWARE COMPONENTS
1. Probes and Sensors
✔ Probe attached to the head contacts the work part surface & collects data
a) Touch-Trigger probes
b) Scanning probes
c) Proximity or Non-contact Probes
2. Mechanical structure
✔ Provides motion of the probe in three Cartesian axes
3. Surface Plate/Worktable
4. Drive system and control unit to move each of the three axes
5. Computer system with application software.
Co-ordinate measuring machines (CMM)
CONSTRUCTION/HARDWARE COMPONENTS
Probes and Sensors
(a) Touch-trigger contact probe
✔ Touch-trigger probes are the most common type.
✔ They physically touch the surface of the workpiece and send a signal with the
coordinates of the point to the CMM.
✔ The computer records this contact point in coordinate space.
✔ An LED light and an audible signal usually indicate contact.
Co-ordinate measuring machines (CMM)
CONSTRUCTION/HARDWARE COMPONENTS
Probes and Sensors
(b) Scanning Probe
✔ This method generally involves passing the probe over a target surface within
its working range.
✔ Scanning contact probes may use an LVDT (linear variable differential
transformer) or an optoelectronic position sensor (which uses an LED and a
position-sensitive detector that detects where the light is hitting).
✔ Used for inspection where the object being measured would be deformed by the
force of a stylus.
Co-ordinate measuring machines (CMM)
CONSTRUCTION/HARDWARE COMPONENTS
Probes and Sensors
(c) Proximity or Non-contact Probes
Proximity or non-contact probes function similarly to displacement-measuring
probes, but they use laser, capacitive or video measurement technology instead
of LVDTs.
Laser probes: project a light beam onto the surface of a part; when the light
beam is triggered, the position of the beam is read by triangulation through a
lens in the probe receptor.
Optical/Video probes: gather data with still-image or video cameras. The feature
is measured by a computer 'count' of the pixels of the electronic image.
Co-ordinate measuring machines (CMM)
MECHANICAL STRUCTURE/CONFIGURATION
1- Cantilever
∙ Initial design of Ferranti in Scotland in the 1950s
∙ High accuracy
∙ Lowest flexibility
∙ Small parts

2- Bridge type (fixed) and bridge type (moving)
∙ Most widely used
∙ High accuracy
∙ Moderate flexibility
∙ Medium-size parts, from 300 × 300 × 300 mm (XYZ) to 2000 mm × 5000 mm × 1500 mm
Co-ordinate measuring machines (CMM)
MECHANICAL STRUCTURE/CONFIGURATION
3- Gantry type
∙ High accuracy
∙ Not portable
∙ Expensive foundation work
∙ For extremely large parts
∙ Size range 2 × 2 × 1 m (XYZ) to 4 × 10 × 3 m (XYZ)

4- Horizontal arm type
∙ Less accuracy
∙ Highest flexibility
∙ Large-size parts
∙ Size range 1 × 2 × 1 m (XYZ) to 4 × 10 × 3 m (XYZ)
Co-ordinate measuring machines (CMM)
APPLICATIONS
The accuracy of these machines today approaches 1 µm (0.001 mm).
The resolution of a present-day coordinate measuring machine is about 5 microns.
Dimensional measurements: measurements made in the x, y, and z directions.
Profile measurements: information (2D or 3D) about the form or profile of an object.
Angularity or orientation, Measurements are made to capture angle information between
points on an object.
Depth mapping
✔ Measuring the difference between two stereo images. Stereo images are successive images
of the same scene taken at slightly different angles.
✔ A depth map is then created, resulting in a single image using different intensities to represent
the different depths.
Digitizing or imaging Provides a digital format or image to visually capture the geometry of
the workpiece from the measurements made by the CMM.
Shaft measurements, Measurements made by CMMs designed specifically for inspecting
shafts.
Research and Development
Rapid Prototyping (RP)
Reverse engineering (RE)
Health care, such as CT (computed tomography) Scan.& MRI (magnetic resonance imaging)
Inventory of works of art and in monument protection (virtual museums), architecture,
and development of building construction.
Co-ordinate measuring machines (CMM)
GENERAL CAUSES OF ERRORS
∙ Misalignment of the table and probes
∙ Wear of components
✔ i.e. the guideways, the scales, the probe system and sphere
∙ The environment in which the CMM operates
✔ The ambient temperature, temperature gradients, humidity and vibration
∙ The probing strategy used
✔ The magnitude and direction of the probe force, the type of probe stylus used
and the measuring speed of the probe
∙ Characteristics of the workpiece
✔ Elasticity, surface roughness, hardness and the weight of the component; the
workpiece must not exceed the maximum weight limit
∙ Perpendicularity error
✔ Occurs if the three axes are not orthogonal
Co-ordinate measuring machines (CMM)
ADVANTAGES
Versatility
A CMM is a general-purpose machine that can be used to inspect a variety of
part configurations
No special fixtures or gages are required.
Because probe contact is light, most parts can be inspected without being
clamped to the table.
Single Setup
As the probe moves along three axes, most parts can be inspected in a single
setup, thus eliminating the need to reorient the parts for access to all features.
Reduced Setup Time
Part alignment and establishing appropriate reference points are very time
consuming with conventional surface plate inspection techniques. Software
allows the operator to define the orientation of the part on the CMM, and all
subsequent data are corrected for misalignment between the part’s reference
system and the machine coordinates.
Improved Productivity
The above-mentioned advantages help make CMMs more productive than
conventional inspection techniques. Furthermore, productivity is realized
through the computational and analytical capabilities of associated data –
handling systems, including calculators and all levels of computers
Co-ordinate measuring machines (CMM)
ADVANTAGES
Flexibility
CMMs are essentially universal measuring machines and need not be dedicated
to any particular task. They can measure almost any dimensional characteristic
of a part configuration, including cams, gears and warped surfaces.
Improved Accuracy
All measurements in a CMM are taken from a common geometrically fixed
measuring system, eliminating the introduction and the accumulation of errors
that can result with hand - gage inspection methods and transfer techniques.
Reduced Operator Influence
The use of digital readouts eliminates the subjective interpretation of readings
common with dial or vernier-type measuring devices. Operator "feel" is virtually
eliminated with modern touch-trigger probe systems.
Automatic data recording
Automatic data recording, available on most machines, prevents errors in
transcribing readings to the inspection report. As a result, less-skilled
operators can easily be instructed to perform relatively complex inspection
procedures.
Measuring Instruments
Presentation Sub-Topics
• Introduction
• Working Mechanism
• Construction
• Applications
• Error Types
• Key Characteristics
(Relevant may be included)
Precision
Accuracy
Calibration
Drift/Stability
Linearity
Magnification/Amplification
Resolution
Sensitivity
The speed of response
• Advantages &/or Limitations
Size and type of part to be measured
Environmental Conditions
The operator skills required
Cost
Gages
Gages have simple solid shapes and cannot be classified as instruments, although
they are very valuable in metrology. Gage blocks are individual square,
rectangular, or round blocks of various sizes. For general use, they are made
from heat-treated and stress-relieved alloy steels. The better gage blocks are
made of ceramics (often zirconia) and chromium carbide; unlike steels, these
materials do not rust, but they are brittle and must be handled carefully.
Environmental temperature control is important when gages are used for
high-precision measurements.
Fixed Gages. These gages are replicas of the shapes of the parts to be measured.
Although fixed gages are easy to use and inexpensive, they
indicate only whether a part is too small or too large compared with an established
standard.
Plug gages are commonly used for holes. The GO gage is smaller than the NOT GO
(or NO GO) gage and slides into any hole that has a dimension larger than the
diameter of the gage. The NOT GO gage must not go into the hole. Two gages are
required for such measurements.
Ring gages: are used to measure shafts and similar round parts.
Angle gage blocks: Angle Gage Blocks permit fast, simple, and accurate
measurement of angles. A set of only 16 blocks will combine to make 356,000
different angles from 0° to 99° in steps of 1 second to an accuracy of less than
1/1,000,000th of a circle.
Gages
General characteristics and selection of
measuring instruments
The selection of an appropriate measuring instrument for a particular application also
depends upon the general characteristics and other factors
• Precision: Degree to which a measuring instrument gives repeated measurement of the same
standard.
• Accuracy: The degree of agreement of the measured dimension with its true magnitude.
• Repeat accuracy: The same as accuracy but repeated many times.
• Calibration: The adjustment or setting of a measuring instrument to give readings that are accurate
within a reference standard.
• Drift/ Stability : An instrument’s capability to maintain its calibration over time; also called stability.
• Linearity: The accuracy of the readings of a tool over its full working range.
• Magnification/Amplification : The ratio of measuring instrument output to the input dimension;
also called amplification.
• Resolution: The smallest unit of measurement that can be indicated by an instrument.
A measuring tape, for example, will have a resolution but not sensitivity.
• Sensitivity: The smallest difference in quantity that will change an instrument's
reading. An analytical balance will have both resolution and sensitivity; its
sensitivity requires it to be protected by a draft shield or an enclosure.
• Rule of 10 (gage maker's rule): An instrument or gage should be ten times more
accurate than the dimensional tolerances of the part being measured (a quick
check is sketched after this list).
• The speed of response: How rapidly a measuring instrument indicates a measurement, particularly
when some parts are measured in rapid succession.
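As a small illustration of the rule of 10 above (the part tolerance and instrument accuracies are made-up figures), candidate instruments can be screened as follows:

```python
# Rule of 10 check: instrument accuracy should be no more than 1/10 of the part tolerance.
part_tolerance = 0.05                      # mm, total tolerance on the dimension (illustrative)
required_accuracy = part_tolerance / 10    # 0.005 mm

candidate_instruments = {                  # hypothetical instrument accuracies, mm
    "steel rule": 0.5,
    "vernier caliper": 0.02,
    "micrometer": 0.005,
}

for name, accuracy in candidate_instruments.items():
    verdict = "suitable" if accuracy <= required_accuracy else "not suitable"
    print(f"{name}: accuracy {accuracy} mm -> {verdict}")
```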
Selection of measuring instrument also depends upon
• Size and type of part to be measured
• The environment (temperature, humidity, dust, pressure, and so on)
• The operator skills required
• The cost of equipment
Standardization & Interchangeability

Interchangeability:
Interchangeability occurs when one part in an assembly can be substituted for
a similar part which has been made to the same drawing. Interchangeability is
possible only when certain standards are strictly followed.
Why Tolerance?
Tolerances need to be defined because we live in a probabilistic world and 100%
reproducibility in manufacturing is not physically possible. You can think of
parts as having varying degrees of imperfection.

Variations in the properties of the material being machined introduce errors.
When you look at machined parts, they look flat and straight, but if you inspect
them with precise and accurate instruments, you will find that there are
imperfections all over the parts. These variations (imperfections) are allowed
within the tolerance limits (constraints) placed on the parts.

∙ The production machines themselves may have some inherent inaccuracies.
∙ It is impossible for an operator to make perfect settings. While setting up
the tools and workpiece on the machine, some errors are likely to creep in.

Tolerance:
"The total amount by which a given dimension may vary, or the difference between
the limits" - ANSI Y14.5M-1982 (R1988) standard [R1.4]
How to specify tolerance (the same dimension written in each form is sketched below)
1-Limit tolerance
2-Plus minus tolerance
2a-Unilateral Plus minus
2b-Bilateral Plus minus
Equal
Unequal
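A brief sketch showing the same hypothetical 25 mm dimension expressed in the limit form and in the plus/minus forms listed above (all values are arbitrary examples):

```python
# The same hypothetical dimension written as a limit tolerance and as plus/minus tolerances.
nominal = 25.00          # mm, basic size
upper_limit = 25.05      # mm
lower_limit = 24.98      # mm

# 1- Limit tolerance: state both limits directly.
print(f"limit form:      {lower_limit:.2f} / {upper_limit:.2f} mm")

# 2b- Bilateral (unequal) plus/minus tolerance: deviations measured from the nominal size.
plus_dev = upper_limit - nominal     # +0.05 mm
minus_dev = lower_limit - nominal    # -0.02 mm
print(f"plus/minus form: {nominal:.2f} {plus_dev:+.2f} / {minus_dev:+.2f} mm")

# 2a- A unilateral tolerance would place both deviations on the same side of the
#     nominal, e.g. 25.00 +0.07 / +0.00 mm.
```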
Variation is Unavoidable
No two manufactured objects are identical in every way. Some degree of
variation will exist.
Terminology
1. Nominal size/Basic size. The required size of the component, before the
Limits are set, is called the Basic Size or Nominal Size.
It is the size of a part specified in the drawing as a matter of convenience.
It is the size in relation to which all limits of size are fixed and will be the
same for both the male and female parts of the fit.
2. Actual size. It is the actual measured dimension of the part. The difference
between the basic size and the actual size should not exceed a certain limit,
otherwise it will interfere with the interchangeability of the mating parts.
3. Limits of sizes. There are two extreme permissible sizes for a dimension of
the part. The largest permissible size for a dimension of the part is called
upper or high or maximum limit, whereas the smallest size of the part is known
as lower or minimum limit.
4. Allowance. It is the difference between the basic dimensions of the mating
parts. The allowance may be positive or negative. When the shaft size is less
than the hole size, then the allowance is positive and when the shaft size is
greater than the hole size, then the allowance is negative.
5. Tolerance. It is the difference between the upper limit and the lower limit of
a dimension. In other words, it is the maximum permissible variation in a
dimension. The tolerance may be unilateral or bilateral. (A worked sketch of
items 3-5 follows this list.)
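A short worked sketch of these terms for a hypothetical 40 mm basic-size hole-and-shaft pair, taking the allowance as the tightest-fit difference (minimum hole minus maximum shaft); the limit values are illustrative only:

```python
# Hypothetical 40 mm basic-size hole-and-shaft pair (clearance intended).
hole_max, hole_min = 40.05, 40.00     # limits of size for the hole (mm)
shaft_max, shaft_min = 39.97, 39.93   # limits of size for the shaft (mm)

hole_tolerance = hole_max - hole_min      # 0.05 mm
shaft_tolerance = shaft_max - shaft_min   # 0.04 mm
allowance = hole_min - shaft_max          # tightest condition; positive here -> clearance

print(f"hole tolerance  = {hole_tolerance:.2f} mm")
print(f"shaft tolerance = {shaft_tolerance:.2f} mm")
print(f"allowance       = {allowance:+.2f} mm")
```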
Tolerance & Types of tolerance
TYPES OF GEOMETRIC TOLERANCE
THERE ARE THREE BASIC TYPES OF GEOMETRIC TOLERANCES:
1. FORM TOLERANCES:
STRAIGHTNESS, FLATNESS, ROUNDNESS, CYLINDRICITY
2. ORIENTATION TOLERANCES:
PERPENDICULARITY, PARALLELISM, ANGULARITY
3. POSITION TOLERANCES:
POSITION, SYMMETRY, CONCENTRICITY
Fit and Types of Fits
Fit
Degree of ease or difficulty in assembly condition between two mating
parts i.e. ‘Hole’ & ‘Shaft’

Clearance fit:
In this type of fit, the largest permitted shaft diameter is less than the smallest
hole diameter so that the shaft can rotate or slide according to the purpose of
the assembly.
Ex: door hinges, wheel and axle, shaft and bushed bearing.
Fit and Types of Fits
Interference Fit:
It is defined as the fit established when a negative clearance exists between
the sizes of holes and the shaft. In this type of fit, the minimum permitted
diameter of the shaft is larger than the maximum allowable diameter of the
hole. In case of this type of fit, the members are intended to be permanently
attached.
Ex: Bearing bushes, Keys & keyways, Wooden Wheel & Outer Metallic
ring, Tire & rim
Fit and Types of Fits
Transition Fit:
In this type of fit, the diameter of the largest allowable hole is greater than the
smallest shaft, but the smallest hole is smaller than the largest shaft, such that a
small positive or negative clearance exists between the shaft & hole.
Ex: Coupling rings, Spigot in mating holes, etc.
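A minimal sketch (with arbitrary example limits) that classifies a hole-shaft pair into one of the three fit types described above:

```python
def classify_fit(hole_min, hole_max, shaft_min, shaft_max):
    """Classify a fit from the limit sizes of hole and shaft (all in mm)."""
    if shaft_max < hole_min:
        return "clearance fit"      # even the largest shaft clears the smallest hole
    if shaft_min > hole_max:
        return "interference fit"   # even the smallest shaft is larger than the largest hole
    return "transition fit"         # either clearance or interference may occur

print(classify_fit(25.00, 25.03, 24.95, 24.98))  # clearance fit
print(classify_fit(25.00, 25.03, 25.05, 25.08))  # interference fit
print(classify_fit(25.00, 25.03, 25.01, 25.04))  # transition fit
```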
Fit and Types of Fits
Description of fits, from more clearance (close to the top of the chart) to more
interference (close to the bottom of the chart):

Clearance fits
∙ Loose running
∙ Free running
∙ Easy running
∙ Sliding
∙ Close clearance
∙ Locational clearance

Transition fits
∙ Location - slight interference, e.g. hubs of gears and pulleys
∙ Location/Transition, e.g. armatures of electric motors on shafts

Interference fits
∙ Location/Interference
∙ Medium drive fit
∙ Force fit
Hole and Shaft Basis System
Book: A Textbook of Production Engineering, by Dr. P. C. Sharma
Limits and fits: standard systems of limits and fits
