Subject: Metrology and Control Engineering

(ME3053)

Chapter 1 - Measurement, Tolerance and Limit Gauges
1. Introduction

• Types of industries
• Departments in an industry
• Functions of the quality control department
• Metrology is the science of measurement.
• It enables us to say something quantitative about any physical quantity.
• Units must be universally standard. (The concept of standards in
measurement units refers to the establishment of consistent and
universally accepted reference points for measuring various
physical quantities.)
• We need to be concerned with accuracy.

• Metrology is mainly concerned with:
• Establishing units of measurement, reproducing these units in the
form of standards, and ensuring their uniformity
• Developing methods of measurement
• Analysing methods of measurement, establishing the uncertainty of
measurement, tracing the causes of measurement errors, and
eliminating them
2. Need of Metrology
Need of inspection
• Inspection of parts at each stage of production
• In mass production, the need to develop gauges
• The concept of interchangeable parts and their acceptable
tolerance zone
Need for Measurement
 To ensure that the part to be measured conforms to the established
standard.
 To meet the interchangeability of manufacture.
 To provide customer satisfaction by ensuring that no faulty product
reaches the customers.
 To coordinate the functions of quality control, production, procurement &
other departments of the organization.
 To judge the possibility of making some of the defective parts acceptable
after minor repairs.
3. Fundamental Concepts

1. Measuring Range
2. Sensitivity: the ratio of scale spacing to the value of a scale division.

sensitivity = scale spacing / scale division value = 1 mm / 0.01 = 100

Now, if the measured quantity changes by 0.005 units, how many mm
of displacement will occur on the dial indicator?

• Hence, sensitivity is the rate of displacement of the indicating device
of an instrument with respect to the measured quantity.

• It is the amplification factor or gearing ratio.
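As a sketch of the arithmetic above (the scale numbers are the slide's; the dial indicator itself is a hypothetical example):

```python
# Sensitivity of a hypothetical dial indicator, using the slide's numbers:
# scale spacing = 1 mm, value of one scale division = 0.01 units.
scale_spacing_mm = 1.0
scale_division_value = 0.01

sensitivity = scale_spacing_mm / scale_division_value  # amplification factor
print(sensitivity)  # 100.0

# Pointer displacement for a 0.005-unit change in the measured quantity:
displacement_mm = sensitivity * 0.005
print(displacement_mm)  # 0.5 mm on the dial
```

So a 0.005-unit change in the measured quantity moves the pointer 0.5 mm, which answers the question posed above.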


https://www.youtube.com/watch?v=aUQyMTUAMos
3. Calibration
• Constant use of an instrument affects its accuracy; hence calibration
is needed.
• Calibration is the process of framing the scale of the instrument by
applying standardized input signals.
• The calibration schedule depends upon:
 Severity of use
 Environmental conditions
 Accuracy of measurement required
4. Repeatability
It is the ability of a measuring instrument to give the same results for
repeated measurements of the same quantity, when the measurements
are carried out:
• by the same observer
• with the same instrument
• under the same conditions
• without any change in location
• without any change in the method of measurement
5. Reproducibility
• Reproducibility is the closeness of agreement between results of
measurement of the same quantity:
• by different observers
• by different methods
• using different instruments
• under different conditions, locations, times, etc.
6. Accuracy and Precision of measuring instruments

 What is accuracy?

 What is precision?

 Are they the same, or is there a difference?

Can you hit the bull's-eye?

Three targets, with three arrows each to shoot. How do they compare?
• Target 1: both accurate and precise
• Target 2: precise but not accurate
• Target 3: neither accurate nor precise
Accuracy
 Accuracy indicates how close a measurement is to the accepted value.

 Accuracy is the degree to which the measured value of the quality


characteristic agrees with the true value.

 For example, we'd expect a balance to read 100 grams if we placed a standard
100 g weight on the balance. If it does not, then the balance is inaccurate.
Precision

 Precision is the repeatability of the measuring process.

 Precision indicates how close together or how repeatable the results are.

 A precise measuring instrument will give very nearly the same result each
time it is used.

More precise                 Less precise

Trial #   Mass (g)           Trial #   Mass (g)
1         100.00             1         100.10
2         100.01             2         100.00
3         99.99              3         99.88
4         99.99              4         100.02
Average   100.00             Average   100.00
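A quick way to see the distinction numerically: both trial sets average to 100.00 g (after rounding), but their scatter differs. A minimal sketch using Python's statistics module, with the data from the tables above:

```python
import statistics

more_precise = [100.00, 100.01, 99.99, 99.99]   # trial masses in grams
less_precise = [100.10, 100.00, 99.88, 100.02]

# The means are essentially equal (100.00 g when rounded to two decimals)...
print(round(statistics.mean(more_precise), 2))
print(round(statistics.mean(less_precise), 2))

# ...but the sample standard deviation shows the first set scatters far less,
# i.e. it is more precise even though both are equally accurate on average.
print(statistics.stdev(more_precise))  # ~ 0.0096 g
print(statistics.stdev(less_precise))  # ~ 0.091 g
```

The standard deviation quantifies precision (spread of repeated readings), while closeness of the mean to the true 100 g value quantifies accuracy.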
Distinction between Precision and Accuracy
4. Measurement Errors

• The difference between actual size of workpiece and measured value


• What are the reasons for inaccuracy in measurement?
1. Due to instrument error (calibration error, hysteresis, backlash,
misalignment of instrument)
2. Due to environmental effects (vibration, temperature)
3. Human error (feel, reading error, fatigue)
4. Bad assembly of instrument
5. Imperfect datum

• Measurement errors are of two types: controllable and non-controllable.

Controllable errors:
1. Environmental Error:
• These are due to the effect of surrounding temperature, pressure and
humidity on the measuring system.
• Vibration, shock and electric fields may also lead to such errors.
• It is internationally accepted that a standard temperature of 20 °C
must be maintained during measurement (for critical work).
• Error due to thermal expansion = α × L × (T − 20 °C), where α is the
coefficient of linear expansion and L is the measured length.
• An improper datum surface or the presence of dirt and chips also
produces error.
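The temperature effect can be estimated with the standard linear-expansion relation; this sketch is not taken verbatim from the slides, and the expansion coefficient for steel is an assumed illustrative value:

```python
# Length error from measuring away from the 20 degC reference temperature,
# assuming simple linear thermal expansion: error = L * alpha * (T - 20).
def thermal_error_mm(length_mm, alpha_per_degC, temp_degC, ref_degC=20.0):
    return length_mm * alpha_per_degC * (temp_degC - ref_degC)

# A 100 mm steel workpiece (alpha ~ 11.5e-6 per degC, assumed value)
# measured at 25 degC instead of the 20 degC standard:
print(thermal_error_mm(100.0, 11.5e-6, 25.0))  # ~ 0.00575 mm apparent growth
```

Even a 5 °C departure from the standard temperature introduces an error of several micrometres on a 100 mm dimension, which is why the 20 °C reference matters for critical measurements.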
2. Elastic deflection:
• Stylus pressure: too large a force applied through the stylus deflects
the workpiece and the instrument.
• This can be controlled by using a ratchet mechanism or a dial indicator.
• Adjust the stylus pressure depending on the workpiece. Since
pressure = force/area, what happens if the contact area reduces?
3. Error due to deflection:
• Sometimes workpieces are very long and must be placed on supports;
in such cases we use the Airy points (support spacing d = 0.577 L).
• When supported at the Airy points, the end faces of the bar remain
parallel.
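The 0.577 factor is 1/√3; a one-line sketch of the support spacing:

```python
import math

# Airy point support spacing for a bar of length L: d = L / sqrt(3) ~ 0.577 L.
def airy_spacing(length):
    return length / math.sqrt(3)

# For a 1000 mm bar the two supports sit about 577 mm apart:
print(airy_spacing(1000.0))  # ~ 577.35 mm
```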
4. Alignment error:
• Cosine error: the error generated when the steel rule and the desired
dimension are not aligned.
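Cosine error follows from simple trigonometry: a rule inclined at angle θ gives a reading l, while the true dimension is l·cos θ. A minimal sketch (the 100 mm / 2° numbers are illustrative, not from the slides):

```python
import math

# Cosine error when a rule is misaligned by theta: the reading l overstates
# the true dimension l*cos(theta), so error = l * (1 - cos(theta)).
def cosine_error(reading, theta_deg):
    return reading * (1.0 - math.cos(math.radians(theta_deg)))

# A 100 mm reading taken with the rule tilted by 2 degrees:
print(cosine_error(100.0, 2.0))  # ~ 0.061 mm
```

Because 1 − cos θ grows roughly as θ²/2, small misalignments produce small errors, but the error grows quickly as the tilt increases.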
5. Parallax error:
• The error that occurs when the pointer on a scale is not observed
along a line normal to the scale.
• Remedy: reduce the distance between the pointer and the scale.

6. Error due to improper instrument selection.

7. Error due to wear.

• Non-controllable errors:
1. Scale error
2. Reading error
3. Hysteresis error: the difference in instrument readings between the
loading and unloading curves.
• Symptom: the pointer does not return to zero when the load has been
removed.
• Causes: the presence of dry friction, the properties of elastic elements,
and the presence of internal stresses.
• Remedy: can be reduced considerably by proper heat treatment and
stabilization.
5. Standards of Measurement

• A standard is defined as "something that is set up and established by
an authority as a rule for the measure of quantity, weight, extent, value
or quality", etc.
• Accurate measurement is made by comparison with a standard of
known dimension; such a standard is known as a primary standard.
• The first accurate standard, the Imperial Standard Yard, was made in
England in 1855. It was followed by the International Prototype Metre,
made in France in 1872.
• As these two standards of length were made of metal, they are often
called material standards.
Imperial Standard Yard

• Material: a bronze bar (82% Cu, 13% Sn, 5% Zn) of 1 in × 1 in cross
section and 38 in length.
• The yard was defined as the distance between the central transverse
lines on the gold plugs, measured at 62 °F (16.67 °C).
• The American standard yard was defined the same way as the
imperial yard, but at 68 °F.
International Prototype Metre

• The length of the metre was defined as the straight-line distance, at
0 °C, between the centre lines engraved on a platinum-iridium alloy
bar (10% iridium) of 102 cm total length, having the cross section
shown in the figure.
Lightwave (optical) length standard

• One metre is equal to 1,650,763.73 wavelengths of the orange-red
radiation of krypton-86 isotope gas.
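The wavelength implied by this definition can be checked with one line of arithmetic:

```python
# One metre = 1,650,763.73 wavelengths, so one wavelength is:
wavelength_nm = 1.0 / 1650763.73 * 1e9  # metres -> nanometres
print(wavelength_nm)  # ~ 605.78 nm (orange-red line of krypton-86)
```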
