
Visual 3100

Measurement – use of basic dimensional measuring instruments
Visual 3100 – MEASUREMENT

Metrology: the science of measurement.

Measurement: the language of science – a way to communicate size, quantity, position, condition… For us, it is the branch of applied geometry concerned with the lengths of lines, areas of surfaces and volumes of solids – pertaining to lines and angles.
Visual 3100 – MEASUREMENT

3 reasons why we need measurement:

- to make things
- to control the way other people make things
- for scientific description – to provide definite information

So the importance is defining and verifying what is required… measuring as a quality function – inspection.

SI system = International System of Units. It has seven base units (length – meter (m),
time – second (s), amount of substance – mole (mol), electric current – ampere (A), temperature – kelvin
(K), luminous intensity – candela (cd) and mass – kilogram (kg)) plus two supplementary units (plane
angle and solid angle).
The meter was defined in 1983 by the General Conference on Weights and Measures
“as the length of the path traveled by light in a vacuum in 1/299,792,458 of a
second. This definition also locked the speed of light at 299,792,458 meters
per second in a vacuum.” (NIST)
Interested in this? Take a look at https://www.nist.gov/si-redefinition/meter
Visual 3100 – MEASUREMENT

Actual size is the measured size.

Basic or nominal size: the design size/dimension, not the actual. The basic/nominal dimension is the size in
relation to which all limits of size are fixed, and it is the same for both the male and female parts
of the fit. Nominal size is a general size used for classification/identification.
Limits of size: the maximum and minimum permissible sizes acceptable for a specific
dimension.
For clarification of how sizes/dimensions may be named and written:
nominal size: the size designation used for general identification. The nominal size of a shaft and a hole are the same. This
value is often expressed as a fraction.
basic size/dimension: the exact theoretical size of a part. This is the value from which limit dimensions are computed. Basic size
is a four-decimal-place equivalent of the nominal size. The number of significant digits implies the accuracy of the dimension.
example: nominal size = 1 1/4, basic size = 1.2500

Tolerance: the total permissible variation in the size of a dimension – the difference
between the upper and lower acceptable dimensions. Tolerance = max size – min size: the permitted
variation around a dimension.
Allowance concerns mating parts (fit and clearance), and is the difference between, for example, the
high limit of size of the shaft and the low limit of size of its mating hole. An allowance may be
positive or negative.
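As a quick sketch of these definitions (the 25 mm clearance-fit limits here are invented for illustration, not from the slides):

```python
# Hypothetical 25 mm clearance fit (values invented for illustration).
hole_max, hole_min = 25.05, 25.00    # limits of size for the hole, mm
shaft_max, shaft_min = 24.98, 24.95  # limits of size for the shaft, mm

hole_tolerance = round(hole_max - hole_min, 3)     # 0.05 mm
shaft_tolerance = round(shaft_max - shaft_min, 3)  # 0.03 mm

# Allowance: low limit of the hole minus high limit of the shaft (tightest fit);
# positive means clearance, negative means interference.
allowance = round(hole_min - shaft_max, 3)         # 0.02 mm
print(hole_tolerance, shaft_tolerance, allowance)
```

Each part has its own tolerance; the allowance describes the fit between them.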
Visual 3100 – MEASUREMENT

Gauging and Tolerances

• Why do we have this? We want to show the relationship between 1. engineering design, 2. product, and 3. part inspection (the
actual part).

Engineering design is what is to be manufactured. The product is what has been manufactured. Part inspection is how we
compare the product with the engineering design.
• The common factor that binds these three areas together is geometric dimensioning and tolerancing

Tolerance

• It is possible to measure features of a part to such precision that no two would be identical
• For their intended purposes, all parts can vary within limits and still be adequate
• Too little precision produces bad products; too much precision prices the goods out of the marketplace
• Either of the above is made worse by the number of individuals involved and especially by their separation
• An unambiguous expression of “good enough” is needed, this is tolerance

Geometric Dimensioning and Tolerance (GD&T)

• GD&T is a means of dimensioning and tolerancing a drawing with respect to the actual function of, or relationship between, part
features – what is needed for further assembly or processes. Tolerances of size and location alone do not provide the
control of a feature needed to produce the final product in a more controlled, universally applied manner
that provides for universal fit of the final manufactured product.
Visual 3100 – MEASUREMENT

Geometric Dimensioning
This type of dimensioning should be used when:
– Features are critical to functionality or interchangeability of the part
– When datum references are desirable to ensure consistency between design, manufacturing and inspection
– When CAD/CAM is being used
– When a standard interpretation or tolerance is not already implied

Maximum Material Condition Principle (MMC)


• Two conditions are necessary: 1. two or more features are interrelated with respect to location or orientation, and at least one of the
related features is a feature of size; 2. the feature(s) to which the MMC principle is to apply must be a feature of size with an axis or centre plane.
Least Material Condition Principle (LMC): the condition in which a feature of size contains the least amount of material within the stated limits of size.

Five Types of Geometric Characteristics


• Form tolerance – how far an actual surface or feature can vary from the drawing
• Orientation tolerance – how far an actual surface or feature may vary relative to a datum or datums
• Profile tolerance – how far an actual surface or feature can vary from the desired form implied by the drawing and/or vary relative to a datum or datums
• Runout tolerance – how far an actual surface or feature can vary from the desired form implied by the drawing during a full 360° rotation of the part
on the datum axis
• Location tolerance – how far an actual size feature is permitted to vary from the perfect location implied by the drawing as related to a datum or
datums or another feature

Tolerancing
• Tolerances of form, orientation, profile and runout state how far actual surfaces or features are permitted to vary from those implied by the drawing
• Expression of form and orientation tolerances refer to flatness, straightness, circularity, cylindricity, parallelism, angularity etc.
• The profile of a surface, profile of a line, circular runout, and total runout tolerances are unique variations and combinations of form, orientation, and
sometimes location, and are considered separate types of characteristics
• Form, orientation, profile or runout tolerances should be specified when established workshop practices cannot be relied upon to provide the
required accuracy and/or a document establishing standards of workmanship cannot be prescribed.
Visual 3100 – MEASUREMENT
Position: how far a feature's location can
vary from its “True Position” – related to a
datum/controlling feature.

Concentricity: how far a cylindrical feature
may vary from a central datum.

Symmetry: used to ensure that two features on
a part are uniform across a datum plane (non-circular
concentricity).

Profile of a surface: describes a
3-dimensional tolerance zone around a
surface, usually an advanced curve
or shape.
Visual 3100 – MEASUREMENT

FCF – Feature Control Frame
Visual 3100 – MEASUREMENT

Geometric Dimensioning & Tolerancing – Feature Control Frames

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100 – MEASUREMENT

Geometric Dimensioning & Tolerancing – Feature Control Frames

(Dotson, Fundamentals of Dimensional Measurement)


Visual testing 3100
Measurement is used to determine an object's dimensions by comparison to a standard (directly or indirectly).
Standard units of measurement: SI/metric units in mm, cm, m, µm.
US customary system: inches, feet, yards.

• Tolerance on a dimension is the difference between the highest and lowest value that a dimension is allowed to
have. Being outside the limits is unacceptable. Gauges such as go/no-go gauges are set to these limits. Examples
of a tolerance: +1 mm / −2 mm, or a symmetrical +/- value.

We use measuring instruments to determine measurements and compare the actual object dimension to the design
(engineering drawing requirements).
• Types of measuring tools: square, combination square (good for flatness, squareness, angle preps and
measurement), straight edge (checking flatness), long steel tape measures (with tension and clamp) for long
measurements, and scaled instruments (precision instruments like the Vernier caliper, micrometer, indicators, etc.).
When using a square, the gap from the surface can be measured using rulers, gauge blocks, feeler gauges, etc.
Taper gauge (taper with scale) to measure gaps. Other instruments include the depth gauge for measuring depth of
grinding, and the UT thickness meter or the surface profile meter used in the previous lab.
• A protractor on a ruler is essential to check the bevel preparation angle for groove welds. A small bevel error results in a
large effect on actual weld size.
• Note: you can always scribe lines/string lines across areas to take measurements at the same axis or point.
Visual testing 3100
Visual 3100

Vernier instruments – read in increments: take the whole division left of zero (the largest
number on the main scale to the right of the index (0)), then the largest whole minor division to the right in the
graduations/increments of the scale (i.e. 0.000 to 0.025), then find the exact graduation that
most exactly coincides with any graduation on the other scale (step 4 in the figure: lined up exactly).

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Inch Metric

Vernier calipers have several different scale graduation possibilities. It is important to be
familiar with the graduation values before attempting to read the result of your measurement.
Inch: main scale graduated in 1", 100 mil (100 thou, .1") and 25 mil (.025") divisions; the vernier reads to the nearest mil (0.001").
Metric: on the main bar every graduation is 1.0 mm, and every 10th graduation is 10 mm (1 cm), 20 mm (2 cm),
etc. The vernier plate (scale) is divided into 50 graduations, each representing 0.02 mm – read to
hundredths of a millimeter (xx.xx).
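The metric reading rule above can be sketched as a small helper (the function name and example values are illustrative):

```python
def metric_vernier_reading(main_scale_mm, coinciding_division):
    """main_scale_mm: last 1.0 mm main-scale graduation left of the vernier zero.
    coinciding_division: vernier line (0-49) that lines up with a main-scale line,
    each worth 0.02 mm."""
    return round(main_scale_mm + coinciding_division * 0.02, 2)

# 13 mm on the main scale plus vernier line 18 coinciding:
print(metric_vernier_reading(13.0, 18))  # 13.36 mm
```

The same pattern applies to the inch scale, with 0.025" minor divisions and a 0.001" vernier.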
Visual 3100

Graduated Angular measurement instruments (manual scaled)


Visual 3100
Example reading: 28 degrees, 15 minutes
Visual 3100

Vernier Caliper

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Vernier Caliper – Alignment Issues

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Vernier Caliper –
Alignment Issues

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Vernier Caliper
Outside diameter measurements

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Vernier Caliper – Physical Issues


Overtightening

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Vernier Caliper – Physical Issues


Jaw Wear

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Vernier Caliper – Physical Issues


Mis-Alignment of
Part to caliper in measuring

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Micrometer Instruments – Outside Micrometer


Visual 3100

Micrometers are designated by the size of their largest opening, and move up in steps of
one inch for Imperial and by 25mm for metric.

Imperial: 1” mic reads from 0 to 1” only


2” mic reads from 1” to 2” only
3” mic reads from 2” to 3” only, etc.

Metric: 25mm mic reads from 0 to 25mm only


50mm mic reads from 25mm to 50mm only
75mm mic reads from 50mm to 75mm only, etc.
Visual 3100

Micrometer in inch. Notice the divisions: the sleeve scale runs 0–.1", with the 3 intermediate lines being .025, .050 and .075".
Live movement of the screw thread gives the micrometer and
vernier scale graduations. Inch: 1, .1, .025 and .001.
Visual 3100

Reading example: 13.36 mm on a metric micrometer.
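The 13.36 mm reading is the sum of the sleeve and thimble components; a minimal sketch (helper and argument names are made up for this illustration):

```python
def metric_micrometer_reading(sleeve_mm, half_line_visible, thimble_division):
    """sleeve_mm: last whole millimetre uncovered on the sleeve.
    half_line_visible: True if a 0.5 mm sub-line is also uncovered.
    thimble_division: thimble graduation at the index line, each 0.01 mm."""
    return round(sleeve_mm + (0.5 if half_line_visible else 0.0)
                 + thimble_division * 0.01, 2)

# 13 mm on the sleeve, no half-mm line, thimble at 36:
print(metric_micrometer_reading(13, False, 36))  # 13.36 mm
```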
Visual 3100

Micrometer Instruments – Ratchet Stop

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Micrometer Instruments

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Gauge Block Sets

These blocks are fundamental to dimensional quality control. Major
characteristics for use include excellent accuracy, wear resistance, surface
finish, dimensional stability and thermal conductivity; these form the basis of
the grade of the block.
The surface finish of the gauge block is important: surface topography
(roughness, waviness, flaws, lay) matters. For surface finish, the old method
was RMS (root mean square); now the arithmetic mean – the average distance
between the mean line and the crests or peaks – is used.
Blocks can be steel, carbide, chromium plated, ceramic or stainless
steel. Carbide blocks can damage other blocks, and their expansion
coefficient is different from that of most materials.
Visual 3100

Gauge Block Sets

Gauge blocks are so useful because they provide
discrimination in 1 µm or 100 µin steps (providing access to
accuracies of 0.03 µm or 10 µin).
The error/uncertainty of block measurement is known.
A set can generate over 10,000 different sizes (lengths).
Good for the 10:1 rule (e.g. ISO/TS 16949 QMS for the
automotive supply chain) applied to inspection of
dimensions and to setting and calibrating other
instruments.
10:1 rule – one tenth of the tolerance: you are required to use a
gauge that has a resolution of 1/10th of the total
tolerance of the dimension being measured. For
example, +/- .04mm = total tolerance of .08mm,
therefore you need a gauge that can discriminate
.008mm.
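The +/- .04 mm example works out directly (illustrative helper, not from a standard library):

```python
def required_discrimination(plus_tol, minus_tol, ratio=10):
    """Smallest graduation a gauge needs under the 10:1 rule."""
    total_tolerance = plus_tol + minus_tol
    return round(total_tolerance / ratio, 6)

# +/-0.04 mm dimension -> 0.08 mm total tolerance -> gauge must discriminate 0.008 mm
print(required_discrimination(0.04, 0.04))
```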
Visual 3100

Gauge Block Sets

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Use and characteristics of Gauge Blocks:

Gauge blocks are used in dimensioning by building a stack smaller than
the opening (the dimension to be measured) and adding smaller or larger
blocks to narrow the range, to determine which blocks (length end
standards to add) to stack up to the dimension of the opening. The gauge block stack
should be tight enough that there is no wobble, yet it can still slide through the opening.

Procedure in building a stack for a set dimension:

Eliminate each decimal place of the dimension starting from the smallest increment (top
down), selecting a block which contains this increment. Build with the least
number of blocks. Why the least number of blocks? Because the final stack-up
dimension requires each block to be successfully wrung together
(adhering with closeness – minimal space between blocks). Refer to your
combining gauge blocks handout.
Visual 3100

Combining Gauge Blocks.

Can use a gauge block holder. Combine by wringing together (but first inspect and
clean – any dirt or wear will cause damage and introduce error; burrs can be detected
with optical flats) so the wrung blocks add up to and fill the dimension of
measurement. Tight contact, minimal gap, and the blocks “bond”.
You can check how well blocks are wrung by using an amplified comparator
(e.g. the electrical one in your lab) and comparing a multi-block stack against
one gauge block of the same length.

Worked example for 1.6638" (note: no 0.100" wear blocks in this example):
First block: .1008 – remainder 1.5630
Second block: .133 – remainder 1.4300
Third block: .130 – remainder 1.3000
Fourth block: .300 – remainder 1.0000
Fifth block: 1.000 – remainder 0
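The top-down elimination procedure can be sketched in code. This is only a rough illustration against a small, hypothetical subset of an inch block set; real selection also weighs wear blocks and set contents:

```python
BLOCKS = [0.1001, 0.1008, 0.101, 0.110, 0.130, 0.133, 0.100,
          0.200, 0.300, 0.500, 1.000, 2.000]  # hypothetical partial set, inches

def digit_at(value, place):
    """Decimal digit of `value` at 10**-place (place 0 = units)."""
    return round(value * 10**place) % 10

def build_stack(target, blocks=BLOCKS):
    stack, remaining = [], round(target, 4)
    avail = sorted(blocks, reverse=True)
    while remaining > 1e-9:
        # finest (rightmost) non-zero decimal place of the remainder
        place = next(p for p in (4, 3, 2, 1, 0) if digit_at(remaining, p))
        # largest block that clears that digit exactly and carries nothing finer
        # (mirrors the manual "eliminate the smallest increment first" step)
        block = next((b for b in avail
                      if b <= remaining + 1e-9
                      and digit_at(b, place) == digit_at(remaining, place)
                      and all(digit_at(b, p) == 0 for p in range(place + 1, 5))),
                     None)
        if block is None:
            raise ValueError("no suitable block for remainder %.4f" % remaining)
        stack.append(block)
        avail.remove(block)  # each block exists once in a set
        remaining = round(remaining - block, 4)
    return stack

print(build_stack(1.6638))  # [0.1008, 0.133, 0.13, 0.3, 1.0]
```

Run against 1.6638" it reproduces the five-block stack in the worked example above.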
Visual 3100

5 principal uses for gauge blocks:

1. Calibration of other instruments
2. Setting of comparators and indicator-type instruments
3. Attribute gauging
4. Machine setup and precision assembly
5. Layout

The place to use gauge blocks with care is when the precision required
increases, the length increases, the importance of reliability and accuracy
increases, and the skill of the measurer decreases (however, ability and
knowledge in handling and wringing blocks properly is still required).
Blocks can be used in stands, adding length end standards (unique-purpose
blocks for the ends of a setup, with for example profiles (shapes/radii),
caliper bars, or scribers).
This is the comparison method of measurement.
Visual Testing 3100

Use of dial indicator: measures change across a length. Measurement by comparison – a mechanical
comparator. Does not have the same discrimination or reliability as electronic instruments.

DEFINITIONS – Discrimination: what we distinguish as each division – what the instrument has clearly
marked as a value.
Resolution: what division discrimination is recognized (smallest readable graduation on the scale).
Amplification of the pointer is through gearing – spindle displacement (by rotation of a gear at pitch-diameter
points) through the lever of the indicator. Gear driven 10:1 (2-gear system), 100:1 (3-gear system).
The smallest readable division is printed on the dial, as we find with many of our measuring
instruments.

The dial face can be rotated to set your standard as zero (setting the reading to zero) prior to comparison.
If the dial is fixed, subtract the setting reading from the part reading on the dial.

+/- from 0 on the indicator means watch whether you are going up or down in comparison to your standard
or zero setting. On indicators with revolution counters (the small circular scale within the dial face),
after the indicator hand/pointer has moved a full revolution around the dial, this revolution is indicated
on the revolution counter. Meaning, what is read from the dial is that measurement difference plus
whatever the full revolution (or revolutions) value on the small counter is.
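Reading an indicator with a revolution counter then reduces to a small sum (assuming a common 0.100" per-revolution dial; check your own dial's values):

```python
def indicator_reading(revolutions, dial_value, per_revolution=0.100):
    """Total reading = full revolutions from the counter plus the dial value.
    Assumes 0.100 in per revolution (a common dial; verify against yours)."""
    return round(revolutions * per_revolution + dial_value, 4)

# Counter shows 2 revolutions, dial shows 0.034 in:
print(indicator_reading(2, 0.034))  # 0.234 in
```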
Visual testing 3100

(Dotson, Fundamentals of Dimensional Measurement)


Visual testing 3100

Only use balanced indicators in the range indicated on the dial, not beyond the half dial revolution.
Balanced indicators have a scale 0-50-0, while continuous dial indicators are 0-90, for example.

(Dotson, Fundamentals of Dimensional Measurement)


Visual 3100

Review Methods of measurement:


1. Direct method: compare the quantity directly with the primary or secondary
standard. Examples: micrometer, ruler, Vernier caliper, gauge block stack.

2. Indirect method (comparison method): e.g. transfer gauges (caliper,
telescoping, small hole gauges, etc.) and dial indicators – using
another standard (comparison) to determine a dimension/value.
Telescoping gauges are usually used to take inside hole dimensions: the
gauge is set when inserted into the hole, then a micrometer measures the
telescoped diameter.
Visual testing 3100

Types of error to eliminate/minimize:


• Random and systematic error. Random error is the general scatter of values for the same
measurand. Systematic error is a very serious error which is repeated with every measurement:
a shift of all measurements consistently off the “true value”. E.g. an incorrect template used,
incorrect interpretation or use of a specification, or an instrument not calibrated correctly to a true-value
reference.
• Parallax error: reading an instrument result at an angle, which causes distortion. E.g. think of a scale
slightly away from what is being measured/read – your eye can line up the lines/arrows/measurement
incorrectly. (See next slide.)
• Cumulative error: error accumulated from adding up multiple measurements which utilize multiple
points. Try to measure from a datum – from a common reference point out. This avoids accumulating
error from each measurement to the next (e.g. poor dimensioning of hole locations 1 to 2, 2 to 3, 3 to
4, etc. vs. reference to 1, to 2, to 3, etc.).
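A small simulation illustrates why datum (common-reference) dimensioning beats chained dimensioning; the ±0.1 mm error band and the hole positions are invented for the demo:

```python
import random

random.seed(1)  # deterministic demo
true_positions = [10.0, 20.0, 30.0, 40.0]  # hole centres from a common datum, mm

def err():
    return random.uniform(-0.1, 0.1)  # each measurement off by up to 0.1 mm

# Datum method: each position measured from the reference -> one error per hole.
datum = [p + err() for p in true_positions]

# Chained method: hole-to-hole steps -> errors accumulate along the chain.
chained, pos, prev = [], 0.0, 0.0
for p in true_positions:
    pos += (p - prev) + err()
    chained.append(pos)
    prev = p

datum_worst = max(abs(m - t) for m, t in zip(datum, true_positions))
chained_worst = max(abs(m - t) for m, t in zip(chained, true_positions))
print(datum_worst, chained_worst)  # datum stays within 0.1; chained can reach 0.4
```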
Visual 3100

Measuring instrument errors to watch for:

Alignment errors:
A. Cosine error: happens if the measuring instrument is not aligned with the
workpiece. The correction is the cosine of the misalignment angle A, which is
multiplied against your indicated measurement. Example: a 30 deg. angle A between
the indicator stylus and the surface gives cos 30° ≈ 0.866; a reading of .018" on the
indicator is then multiplied: .018 x 0.866 ≈ .0156", the corrected measurement. Meaning
that measuring the height/thickness of a 0.015" feeler gauge with a dial indicator whose
spindle is at 30 degrees, as below, would read incorrectly at about 0.018".

B. Elastic deformation: overtightening the instrument on the part can deflect the
gauge (cause bowing – the instrument reads larger) or cause deformation of the gauge
into the part surface (the instrument reads smaller).

C. Sine error (Abbe's error): not measuring along the axis or correct alignment
of the dimension, as dimensioned. Example: the Vernier caliper scale not in line
with the dimension alignment on the part measured (see previous slide).
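The cosine correction can be sketched directly (values taken from the example above):

```python
import math

def cosine_corrected(indicated, misalignment_deg):
    """True value when the stylus sits at an angle to the measurement direction."""
    return indicated * math.cos(math.radians(misalignment_deg))

# 0.018 in indicated with the spindle 30 degrees off: 0.018 x cos30 (~0.866)
print(round(cosine_corrected(0.018, 30), 4))  # 0.0156
```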
Visual 3100

Calibration and Traceability


What is Calibration?

Calibration determines the accuracy relationship of a gauge or
an instrument to a higher standard – for example, a gauge
block.
Calibration is used to ensure traceability back to a
standard; each calibration rests upon the calibration of a
higher standard until the International Standard is reached.

Calibration results provide understanding of instruments


stability in addition to accuracy in measurement: The
maximum drift in readings/results over time.

What is Traceability?
Traceability is the authentication of the accuracy of lineage of
any measurement back to the absolute standard
Visual 3100

Calibration and Traceability


What is a Standard?
A standard is an agreed on unit for use in measurement.
The standard is the ultimate embodiment of the unit.
A standard is a copy that is traceable back to the international standard.

Precision: the repeatability of a measuring process; it is possible to be accurate but not precise.

Measurement Uncertainty: All measurements are subject to uncertainty and a measurement result is complete only when it is
accompanied by a statement of the associated uncertainty.

Accuracy: the deviation of a measured value from the true value being measured; expected uncertainty of the reading relative to
the standard.

Repeatability: maximum variation among several measurements taken with one instrument on one part feature.

Discrimination: the degree to which an instrument subdivides the unit of length being used for measurement.

Resolution: smallest difference between readings that can consistently be detected.


Visual 3100- Measurement

Standards, traceability and accuracy


Accuracy is derived from the Standard, flowing down to the
last point of measurement. If the link (traceability) is broken,
accuracy is lost (the uncertainty is not known); that is why there is a need
for traceability.
I.e. where the traceability chain is broken, we do not know the measurement
uncertainty (the determination of errors is not complete) and therefore the accuracy.
The uncertainty gives a picture of the accuracy to be applied to the
measurement: how close can we be to the known true value with our
measurement (using an instrument and process).
For calibration (in general for shops/companies) ANSI Z540 or ISO 10012 is used, which utilize the
4:1 rule: using a check standard that has an uncertainty less than or equal to 25% of the
tolerance of the instrument/gauge – be capable of measuring with an accuracy ¼ of the tolerance
required.
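The 4:1 check can be written as a one-liner (hypothetical values):

```python
def meets_4_to_1(standard_uncertainty, gauge_tolerance):
    """4:1 rule: the check standard's uncertainty must be at most 25% of the
    tolerance of the instrument/gauge under calibration."""
    return standard_uncertainty <= gauge_tolerance / 4

# Gauge toleranced to 0.004 mm, check standard uncertain to 0.0005 mm:
print(meets_4_to_1(0.0005, 0.004))  # True: 0.0005 <= 0.001
```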
Visual Testing 3100

When estimating measurement uncertainty for a measurement parameter, it is necessary


to take into consideration the error permitted by accuracy specification, precision and
resolution. ISO Guide 99 defines the terms as follows:
Accuracy: Closeness of agreement between a measured quantity value and a true quantity
value of a measurand (what is measured).
Accuracy defines the permissible measurement error from the nominal value. Under
normal circumstances, we take more than one measurement (five to 10 measurements are
ideal) to verify an instrument’s accuracy by comparing the average of those repeated
measurements to the nominal value.
Precision: Closeness of agreement between indications or measured quantity values
obtained by replicate measurements on the same or similar objects under specified
conditions.
Precision is defined by the repeatability of the instrument and is normally expressed as a
standard deviation. This is also known as Type A data for measurement uncertainty
analysis.
Resolution: Smallest change in a quantity being measured that causes a perceptible
change in the corresponding indication.
Resolution can depend on, for example, noise (internal or external) or friction. It may also
depend on the value of a quantity being measured. Resolution of a displaying device is the
smallest difference between displayed indications that can be meaningfully distinguished
(the divisions you can read on a Vernier scale for example).
The terms accuracy and precision are sometimes interchanged incorrectly in product
specification literature and by the end user. The relationship between accuracy and
precision is illustrated in Figure 1.
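The accuracy/precision distinction can be illustrated with invented repeat readings of a 25 mm standard:

```python
import statistics

# Ten invented repeat readings of a 25.000 mm standard, mm.
readings = [25.002, 24.999, 25.001, 25.000, 25.003,
            24.998, 25.001, 25.000, 25.002, 24.999]
nominal = 25.000

accuracy_error = statistics.mean(readings) - nominal  # offset of the average from nominal
precision = statistics.stdev(readings)                # repeatability (Type A) as a std deviation
print(round(accuracy_error, 4), round(precision, 4))
```

A large offset with a small standard deviation means precise but not accurate; the reverse means accurate on average but not precise.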
Visual 3100- Measurement

The Rule of 10
“The discrimination of the instrument should divide the
tolerance into 10 parts”

In other words:
the gage or measuring instrument should be 10 times as accurate as
the characteristic to be measured.

Many believe that this only applies to the instruments used to calibrate a gauge or
measuring instrument, when in reality it applies to the choice of instrument for
any measuring activity. The whole idea here is to choose an instrument that is
capable of detecting the amount of variation present in a given characteristic, to
discern whether the measurement taken is within the required tolerance – a true accept
decision.
ISO/TS 16949, applied to inspection in the automotive supply chain, requires this.
The 10:1 rule deals with the precision of measuring instruments; the catch is that,
applied repeatedly up a calibration chain, you quickly reach the accuracy of the international standard.
Visual 3100- Calibration example

Example Calibration of Vernier caliper:


Test Characteristic: Inspect the inside and Outside Jaws, Depth Gauge and sliding jaw for smooth movement
Test Method: Visual, Touch
Acceptable Limit: No damage, nicks, or burrs. Should have straight and parallel faces with no free play over the whole
length.

Step 1. Zero the Vernier Caliper at the start and adjust as required by the manufacturers’ specifications. If you cannot zero it then
mark it as a fail.

Step 2. Next measure a known standard: gauge blocks (which have been calibrated and are traceable to a higher standard – recall
the calibration and traceability chain in accuracy and measurement uncertainty). When testing the Vernier caliper, one of the points
must be near the lower limit that the instrument can measure, another somewhere in the middle, and the third near the upper
limit. Use a conversion factor of 25.40 mm/in to convert gauge block lengths from imperial to metric.

Step 3: Test Characteristic: Outside Jaws Test Method: Using gauge blocks, measure and record different lengths Acceptable Limit:
.001” or .002” must be defined.

Step 4: Test Characteristic: Inside Jaws Test Method: Use a calibrated micrometer to measure gauge blocks of appropriate length,
lock the micrometer at that length then measure between the micrometer anvils using the inside jaw of the instrument and record
the data.
Step 5: Test Characteristic: Depth Gauge Test Method: Make 2 equal stacks of gauge blocks on the granite surface to a height in the
mid range of the instrument, measure and record the height with the rod between the two stacks and the flat part of the
instrument flush against the top of each of the two stacks
