1.1 OBJECTIVES
1.) serve as an introduction to basic measurement techniques
2.) acquaint the student with the use of common measuring instruments, including micrometers,
dial calipers, and dial gages
1.3 BACKGROUND
1.3.1 MICROMETERS
Micrometers are frequently used to measure dimensions such as the diameter and length of rods,
the thickness of shafts, etc. They are available in sizes of (0-1”), (1-2"), (2-3"), and so on.
After choosing the appropriate micrometer for the measurement, the object to be measured is
placed between the measuring surfaces on the anvil and spindle. The spindle can be moved back
and forth by rotating the thimble. Once both measuring surfaces make contact with the object to
be measured, the dimension is read from the scale on the micrometer’s sleeve and thimble.
The screw threads on the concealed part of the spindle have a pitch of 40 threads per inch. That
means that one complete revolution of the spindle moves it longitudinally 1”/40 or twenty-five
thousandths of an inch (0.025”). The sleeve is marked with 40 lines to the inch, corresponding to
the number of threads on the spindle. When the micrometer is closed, the beveled edge of the
thimble coincides with the 0 mark on the sleeve. Open the micrometer by revolving the thimble
one full rotation, or until the 0 line on the thimble again coincides with the next vertical line on
the sleeve; the distance between the anvil and the spindle is now 0.025" or 1/40 of an inch, and
the beveled edge of the thimble will coincide with the second vertical line on the sleeve. Each
vertical line on the sleeve indicates a distance of 0.025", and every fourth line equals 0.100". The
beveled edge of the thimble has twenty-five equal divisions, numbered 0-25. Rotating the
thimble from one of these marks to the next moves the spindle axially 1”/1000 or 0.001”;
rotating it two marks moves it 0.002”, and so on. In this manner the micrometer can be used to
measure accurately to the nearest 0.001”.
1.3.1.2 HOW TO READ A MICROMETER
1.) Identify the number of vertical divisions (excluding the starting 0th division) visible on the
sleeve, between the 0th division and the thimble’s beveled edge. For example, in Figure 1.1
the number of vertical divisions showing on the sleeve is 10.
2.) Identify the number of horizontal divisions on the thimble’s edge, from the 0th division to the
line that coincides with or is closest to the horizontal line on the sleeve. (Again, see Figure
1.1.) The number of horizontal divisions on the thimble is 6.
3.) The micrometer reading (in thousandths of an inch) is:
[(# of divisions on the sleeve * 25) + (# of divisions on the thimble)]
So from Figure 1.1, the measurement is:
(10*25) + (6) = 256 thousandths of an inch or 0.256”
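The steps above reduce to simple arithmetic; a minimal Python sketch (the function name is ours, not from the manual):

```python
def micrometer_reading(sleeve_divisions: int, thimble_divisions: int) -> float:
    """Convert sleeve and thimble division counts to inches.

    Each sleeve division is 0.025" and each thimble division is 0.001",
    so the reading in thousandths is sleeve*25 + thimble.
    """
    thousandths = sleeve_divisions * 25 + thimble_divisions
    return thousandths / 1000.0

# Figure 1.1: 10 sleeve divisions and 6 thimble divisions
print(micrometer_reading(10, 6))  # 0.256
```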
For another slightly different explanation of the divisions on a micrometer take a close look at
the micrometer located in Figure 1.2 on the next page. Any confusion about reading a
micrometer should be cleared up by this figure. Any further questions about the micrometer
should be directed toward the lab TA.
1.3.2 DIAL CALIPERS
Dial calipers are also frequently used to make measurements of the outer and inner diameters of
tubes, thicknesses of sheets, depths of holes, etc. Typical calipers are available in sizes of 0-6
inches up to 0-24 inches. The original vernier calipers had two scales: the primary scale and the
vernier scale. The final reading on a vernier caliper was obtained by a procedure similar to
measuring with a micrometer. The calipers you will be using in the lab are known as
dial calipers. The object to be measured is positioned between the measuring surfaces, and the
thumb wheel is rolled until both measuring surfaces make contact with the object. The
depth probe at the back of the dial caliper can be used to measure the depth of a hole. This is
achieved by positioning the stationary edge of the main scale against the upper edge of the hole
and moving the depth probe downward until it touches the bottom of the hole.
The primary scale is numbered with a series of large and small numbers. Each large number
indicates the number of inches being measured. Each small number indicates 1/10 of an inch
(0.10"). On the dial scale each number indicates 1/1000 of an inch (0.001").
1.) Identify the large number on the main scale that is closest to and left of the edge of the dial.
2.) Identify the small number on the main scale closest to and left of the edge of the dial.
3.) Identify the number on the dial; it is indicated by the needle.
4.) The caliper reading (in inches) = (large # on main scale * 1”) + (small # on main scale *
0.1”) + (# on dial * 0.001”)
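The four-step reading above can likewise be sketched in Python (the function name and the example values are illustrative, not from the manual):

```python
def dial_caliper_reading(large: int, small: int, dial: int) -> float:
    """Caliper reading in inches = large*1" + small*0.1" + dial*0.001"."""
    return large * 1.0 + small * 0.1 + dial * 0.001

# e.g. large number 1, small number 3, dial needle at 24
reading = dial_caliper_reading(1, 3, 24)
print(round(reading, 3))  # 1.324
```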
1.3.3 DIAL INDICATORS
Dial indicators, as shown in Figure 1.4, are popular comparative measuring tools. They measure
variations in attributes such as flatness, concentricity, etc. The probe at the bottom of the dial
indicator is positioned to contact the surface of a specimen, and enough pressure is put on the
probe to move it approximately halfway through its range of travel. The dial is adjusted to read
zero. A change in specimen size alters the position of the probe, which moves the dial pointer
through a rack and pinion mechanism. The graduations on the dial face are calibrated for direct
reading of variation from the nominal dimension.
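Because the indicator is zeroed at the nominal dimension, the actual size is the nominal plus the signed dial deviation; a small sketch in Python (names and example values are ours):

```python
def indicated_size(nominal: float, dial_deviation: float) -> float:
    """Dial indicator zeroed at `nominal`; the dial then reads the signed
    deviation directly, so actual size = nominal + deviation."""
    return nominal + dial_deviation

# A part with 0.500" nominal size reading -0.002" on the dial measures 0.498"
print(round(indicated_size(0.500, -0.002), 3))  # 0.498
```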
Dial indicators may also be used to make direct measurements. Two styles of indicators are
commonly available:
Continuous Reading Indicators - The indicators we use today are equipped with a stem
that moves up or down as the contact point touches the specimen, showing a direct
reading on a calibrated dial scale. Graduations may be 0.001", 0.0005", or 0.0001".
Dials for metric dimensions are available in 0.010 mm, 0.005 mm, 0.002 mm, and
0.001mm.
Universal Dial Test Indicators - The contact point moves from side to side rather than up
or down; they are used mainly for making setups on machines or with height gages.
1.3.4 INCREASING IMPORTANCE OF THE INTERNATIONAL
ORGANIZATION FOR STANDARDIZATION (I.S.O. / METRIC SYSTEM)
IN UNITED STATES MANUFACTURING
With many American companies now doing business overseas, the I.S.O. (metric) system has
gained increasing importance in U.S. manufacturing over the traditional U.S. Customary
Standards (English) system. For this reason it is important to know how to convert basic
measurements from one system to the other.
The basic conversion factor is that one inch (1.000") equals 25.4 mm. Multiplying the number
of inches by 25.4 [(# of mm) = (25.4 mm/inch) * (# of inches)] gives the required
value in millimeters.
Using the same principle, metric measurements can be converted to English measurements; the
basic conversion factor is 1/25.4 inches/mm, or approximately 0.0394 inches/mm.
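Both conversions follow directly from the single factor of 25.4 mm per inch; a short Python sketch (function names are ours):

```python
MM_PER_INCH = 25.4  # exact by definition

def inches_to_mm(inches: float) -> float:
    """English to metric: multiply by 25.4 mm/inch."""
    return inches * MM_PER_INCH

def mm_to_inches(mm: float) -> float:
    """Metric to English: divide by 25.4 (i.e. multiply by ~0.0394 in/mm)."""
    return mm / MM_PER_INCH

print(inches_to_mm(1.0))    # 25.4
print(mm_to_inches(25.4))   # 1.0
```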
As a first step toward adopting the metric system, many American shops use dual dimensioning.
The metric measurements are shown above the English dimensions, with a line drawn between
them.
Measuring devices that have a scale of some sort and that yield a reading have a range of
measurement. Within that measuring range, they can distinguish between different
measurements to some limiting resolution. The resolution of the instrument, that is to say, the
smallest change in a measurement that it can observe, is an indication of the instrument’s
precision.
However, some measuring devices are only intended to make a relative comparison. All they are
able to tell us about a measurement is whether it is larger, smaller, or indistinguishable from
some standard. One of the most common comparative measuring devices is called a gage block.
A gage block is a very precisely manufactured block of tool steel. The block is of a known size.
If we compare an object to that block, we can determine if it is larger, smaller, or about the same
as the block. What we cannot determine is the degree to which it is larger or smaller.
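A comparative measurement therefore yields only one of three outcomes; a sketch in Python, where the resolution threshold is a hypothetical stand-in for the comparator's sensitivity:

```python
def compare_to_gage_block(part_size: float, block_size: float,
                          resolution: float) -> str:
    """Comparative measurement: report only whether the part is larger than,
    smaller than, or indistinguishable from the gage block."""
    diff = part_size - block_size
    if abs(diff) < resolution:
        return "indistinguishable"
    return "larger" if diff > 0 else "smaller"

print(compare_to_gage_block(1.0004, 1.0000, 0.001))  # indistinguishable
print(compare_to_gage_block(1.0050, 1.0000, 0.001))  # larger
```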
Sometimes we are unable to take measurements directly. Sometimes physical obstructions
prevent the measuring device from being used. For example, a standard micrometer is only able
to measure outside dimensions. Because of this limitation, it cannot directly measure the
inside diameter of, for instance, a hole. Instead, the size of the hole can be
determined by an indirect measurement. A hole gage is inserted into the hole and expanded to
the size of the hole. It is then locked in position. Once removed, the outside dimension of the
gage, now the same size as the hole, can be measured.
Name________________________
2.) Five complete revolutions of the thimble on a US customary micrometer will move the spindle axially a distance
of ____________________.
4.) A specimen that is found to have a hardness of 60 on the Rockwell C scale (60 HRC) will have approximately
what hardness on the Brinell scale? _________ HB.
ISEGR 3723
Lab 1 - Metrology - Data Sheet
1.) Each student should select a specimen and record the specimen number on their individual
data sheet.
2.) Measure and record the dimensions shown in the drawing using the specified measuring
instrument.
[Specimen drawing: dimensions labeled D, E, and G; markings 4" and 0.3]
1.) Obtain a specimen of steel and place it on the platform of the leveling screw (#3 in Figure
2.1). Note: The instructors will have already preset the tester for testing steel so please do
not change any of the settings on the machine.
2.) Rotate the black handwheel at the base of the leveling screw (#5 in Figure 2.1) clockwise
until the specimen just touches the point of the indentor (#2 in Figure 2.1).
3.) Now look at the digital display screen (#1 in Figure 2.1). Continue rotating the handwheel
until four dashes appear on the screen. Once the four dashes appear, continue turning the
handwheel until the display blanks out. At this point the tester is applying the necessary load
and is computing the value of the hardness. Note: Do not over-rotate the handwheel, or the
tester will display an error and will not give a reading.
4.) After the load is applied, it will automatically be removed and the test result will be
displayed. Record this value on your data sheet.
5.) Rotate the handwheel counterclockwise until the specimen clears the indentor and remove the
specimen. If you look at the specimen you will notice an indentation the size of a pinhead
where the indentor entered the specimen.
Following the Lab Report Guidelines included in Appendix A of this manual, each student must
individually prepare and submit a report. In particular, your lab report should include the
following:
• Tabular summary of measurements for specimens in Part I. (Include data and specimen
numbers for all samples measured.)
• Tabular summary of data from Part II.
1.) Explain the difference between direct and indirect reading linear measurements.
2.) Explain what is meant by comparative length measurement.
3.) Why is control of dimensional tolerance important?
4.) Why is the abrasive wear resistance of a material a function of its hardness?
5.) Make a sketch of the graduated scales of a micrometer with a reading of 0.738".