
SECTION 1 INTRODUCTION AND DESCRIPTION

Vernier, dial, and digital calipers give a direct reading of the distance measured to high accuracy. They are functionally identical, differing only in how the result is read. Each comprises a calibrated scale with a fixed jaw and a second jaw, carrying a pointer, that slides along the scale. The simplest method is to read the position of the pointer directly on the scale; when the pointer falls between two markings, the user can mentally interpolate to improve the precision of the reading. That would be a simple calibrated caliper, but the addition of a vernier scale allows more accurate interpolation and is universal practice: this is the vernier caliper.

Vernier, dial, and digital calipers can measure internal dimensions (using the upper jaws), external dimensions (using the lower jaws), and in many cases depth, by use of a probe attached to the movable head that slides along the centre of the body. This probe is slender and can reach into deep grooves that may prove difficult for other measuring tools. The scale may carry metric measurements on the lower part and inch measurements on the upper, or vice versa, in countries that use inches. Vernier calipers commonly used in industry provide a precision of 0.01 mm (10 micrometres) on metric scales, or one thousandth of an inch on inch scales. They are available in sizes that can measure up to 1,829 mm (72 in).
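The interpolation a vernier scale performs is simple arithmetic: the main-scale value is combined with the vernier division that lines up with a main-scale mark. A minimal sketch, assuming a common 50-division metric vernier over a 1 mm main-scale pitch (the function name and scale layout are illustrative, not from this procedure):

```python
def vernier_reading(main_scale_mm, coinciding_division, divisions=50):
    """Combine the main-scale reading (mm) with the vernier division that
    coincides with a main-scale mark. A 50-division vernier over a 1 mm
    main-scale pitch resolves 1/50 = 0.02 mm per division."""
    resolution_mm = 1.0 / divisions
    return main_scale_mm + coinciding_division * resolution_mm

# e.g. main scale just past 12 mm, 17th vernier line coincides
reading_mm = vernier_reading(12, 17)  # 12 + 17 * 0.02 = 12.34 mm
```

The same arithmetic applies to inch verniers; only the pitch and division count change.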



Ensure that the caliper's jaws slide smoothly and freely along the full length of the TI beam; if not, take corrective action. Inspect to ensure that the jaws are free of nicks and burrs and that the TI is clean and free from damage that would impair its operation. NOTE: TI test points are determined by selecting 4 test points within the first inch of range and 4 to 8 additional test points extending over the remainder of the TI range at approximately equal spacing.



To minimize the number of gage blocks needed to test calipers with higher ranges, select test points at 25, 50, 75, and 100% of the TI range beyond the first inch and round to the nearest inch. Use major vernier graduations or normal electronic digital values as test points, as necessary.


PRELIMINARY OPERATIONS

3.1 TI INSPECTION. Ensure that the work area is clean, well illuminated, free from excessive drafts and excessive humidity, and that the rate of temperature change does not exceed 4 °F (approximately 2.2 °C) per hour.

3.2 Ensure that the gage block set (Figure 2.1) is clean and that the TI and the gage blocks have been allowed to stabilize at the ambient temperature for a minimum of 2 hours.


Figure 2.1



This procedure describes the calibration of internal and external dial, vernier, and digital calipers and outside micrometers. The instrument being calibrated is referred to herein as the TI (Test Instrument). This procedure includes test and essential performance parameters only. Any malfunction noticed during calibration, whether specifically tested for or not, should be corrected.

Table 1. Calibration Description

TI | TI Characteristic | Performance Specification | Test Method
Dial, Vernier and Digital Calipers | Zero indication test | Test point: 0 in.; Tolerance: +/- .0005 in. | Determined by sliding the jaws together and reading the TI indication.
Dial, Vernier and Digital Calipers | Outside accuracy | Range: 0 to 12 in.; Tolerance: see Table II | Comparison to gage blocks placed between the TI jaws.
Dial, Vernier and Digital Calipers | Inside accuracy | Range: 0 to 12 in.; Tolerance: see Table II | Comparison to gage block dimensions, placing the blocks, with attached caliper jaws, outside the TI jaws.
Dial, Vernier and Digital Calipers | Depth gage accuracy | Range: 0 to 12 in.; Tolerance: see Table II | Comparison to gage block dimensions, placing the TI depth rod against the gage block and reading the TI outside dimension scale.
Outside Micrometers | Length and linearity | Range: 0 to 12 in. | Measured by comparing TI indications to gage block dimensions set up to test the basic length and micrometer head linearity.


EQUIPMENT REQUIREMENTS

Table 2. Equipment Requirements (Calibration Equipment: ESSM #SE0060)

Item | Minimum Use Specifications
2.1 Gage block set | Range: .050 to 4 in.; Tolerance: .00005 in.
2.2 Low power magnifier | To aid in reading the TI vernier or barrel scale
2.3 Micrometer wrench | Adjustment of micrometer
2.4 Small jeweler's screwdriver | Adjustment of caliper
2.5 Small clamp | Hold gage blocks for inside measurement comparison
2.6 Light machine oil or spray | Lubrication of slides or thimble
2.7 Clean flat surface | Surface plate or steel block

SECTION 2 CALIBRATION PROCESS

NOTES
Unless otherwise specified, verify the results of each test and take corrective action whenever a test requirement is not met before proceeding. Cotton gloves should be worn when handling gage blocks to prevent the transfer of body heat and to protect the gage surfaces.

2.1 ZERO TEST
2.1.1 Slide the TI jaws together, ensuring that no light is visible between the jaws' measuring surfaces. If the TI has inside measurement capability, verify that the dial indicator or vernier indicates zero, as necessary.
2.1.2 Adjust the dial bezel, if necessary. Tighten the TI sliding jaw set screw, if applicable.
2.1.3 If the TI has a digital readout, depress the zero set and verify that the digital indication reads 0.000.
2.1.4 If the TI is a vernier type, verify that the TI zero marks are aligned, as applicable.

2.2 OUTSIDE ACCURACY TEST
2.2.1 Determine the gage blocks required to obtain test points at a minimum of 4 points throughout the first inch of the TI, as follows: .125, .300, .650, and 1.000 inch.

2.2.2 Open the TI to beyond the first test point. Insert the gage blocks and close the jaws until they are firmly in contact with the gage block. Repeat each reading 3 to 5 times and verify that each indication does not vary from the gage block dimension by more than +/- the least dial graduation.
2.2.3 Verify that the first-inch test points are within +/- .001 inch for 0-6 in. range calipers and +/- .002 inch for 0-12 in. range calipers. Repeat steps 2.2.2 and 2.2.3 for the remaining test points at 25%, 50%, 75%, and full range (100%).

2.3 INSIDE ACCURACY TEST
2.3.1 Determine the gage blocks required to obtain test points at a minimum of 4 points throughout the first inch of the TI, as follows: .125, .300, .650, and 1.000 inch.
2.3.2 Using a three gage block setup, place the test gage block between two other blocks and secure with a thumb screw type clamp (Figure 1). Open the TI to approximately the first test point. Insert the TI and open the jaws until they are firmly in contact with the inside of the gage blocks. Repeat each reading 3 to 5 times and verify that each indication does not vary from the gage block dimension by more than +/- the least dial graduation.
2.3.3 Verify that the first-inch test points are within +/- .001 inch for 0-6 in. range calipers and +/- .002 inch for 0-12 in. range calipers. Repeat steps 2.3.2 and 2.3.3 for the remaining test points at 25%, 50%, 75%, and full range (100%).
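The acceptance checks used in the outside and inside accuracy tests follow one pattern: repeated readings must sit within a tolerance band around the gage block's nominal dimension. A minimal sketch (function names and sample readings are illustrative, not from this procedure):

```python
def within_tolerance(readings, nominal_in, tolerance_in):
    """True if every repeated reading falls within +/- tolerance of the
    gage block's nominal dimension (all values in inches)."""
    return all(abs(r - nominal_in) <= tolerance_in for r in readings)

def first_inch_tolerance(ti_range_in):
    """First-inch acceptance limit: +/- .001 in. for 0-6 in. calipers,
    +/- .002 in. for 0-12 in. calipers."""
    return 0.001 if ti_range_in <= 6 else 0.002

# e.g. three readings at the .125 in. test point on a 0-6 in. caliper
ok = within_tolerance([0.1252, 0.1249, 0.1251], 0.125, first_inch_tolerance(6))
```

A reading of 0.127 in. at the same test point would fail, triggering the corrective action required by the NOTES above.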
Figure 1. Inside Measurement Set-up (test point gage block clamped between two gage blocks)

2.4 DEPTH ACCURACY TEST
2.4.1 Position the end of the TI beam against a 1.0 inch gage block surface with the end of the depth rod against the surface plate or steel gage block.
2.4.2 Ensure that the TI measuring surfaces are squarely placed against the surface of the gage block and the surface plate. Make any necessary final adjustments and note the scale indication.
2.4.3 Verify that the value(s) noted in the preceding step is within +/- .001 inch for 0-6 in. range calipers and +/- .002 inch for 0-12 in. range calipers.
2.4.4 Perform steps 2.4.1 through 2.4.3 for each additional inch of depth gage range, changing gage blocks as necessary.




Complete a Measuring and Test Calibration Record (QA-4) for each item calibrated. Utilize additional QA-4 forms as continuation sheets, as needed. Affix a calibration label to the TI with the date calibrated, the date of the next calibration, and who performed the calibration. 3.2.1 A special calibration label shall be applied to any item that has not been calibrated to its full range of capability.



All completed calibration records shall be filed and maintained.

MICROMETER CALIBRATION

Aim:
- To study various types of micrometers.
- To calibrate the given micrometers, using slip gauges as the standard.
- To study the use of a combination set.

Apparatus:
- Set of micrometers
- Set of slip gauges
- Combination set

Procedure (for calibration of micrometers):
1. Check the range of measurement of the micrometer.
2. Note down the zero error of the micrometer, if any.
3. Select a number of slip gauge combinations.
4. Measure each slip gauge combination with the micrometer and note down the micrometer reading (M) and the slip gauge combination length (G) in tabular form.
5. Plot a calibration chart for the micrometer, taking M on the X-axis and (M - G) on the Y-axis.
6. Repeat steps 1-5 for the other micrometers.

Precautions:
- While making a slip gauge combination, do the wringing correctly, so that no foreign particles are entrapped.
- While taking a micrometer reading, clamp the spindle in position before taking the micrometer away from the block; otherwise, due to friction, the spindle will rotate and give a wrong reading. Always turn the spindle in the clockwise direction to avoid backlash error.
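Step 5 plots each reading M against its error (M - G). Tabulating that data can be sketched as below; the function name and sample values are illustrative, not from the procedure:

```python
def calibration_table(data):
    """data: list of (M, G) pairs - micrometer reading and slip gauge
    combination length, both in mm. Returns (M, M - G) pairs, i.e. the
    X and Y values of the calibration chart, with errors rounded to
    0.0001 mm to suppress floating-point noise."""
    return [(m, round(m - g, 4)) for m, g in data]

# e.g. three slip gauge combinations measured with one micrometer
points = calibration_table([(5.01, 5.00), (10.00, 10.00), (24.98, 25.00)])
# points -> [(5.01, 0.01), (10.0, 0.0), (24.98, -0.02)]
```

The resulting pairs can be plotted directly (for instance with matplotlib) to obtain the calibration chart; a constant offset across all points suggests zero error, while a slope suggests a scale error.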


Calibration testing reduces the number of test levels needed to characterize performance, but its most important function is ensuring that everyone works from the same reference. You can have a block of metal that is exactly one inch long on a side, ask multiple laboratories to measure it, and, depending on the reference each uses, get entirely different answers. This matters even more for global references: if an item made in China uses AAA batteries, you want to know that batteries made in Taiwan will fit, and that is only possible through calibration. Engineering a product means nothing if it cannot be built because the tools used in its production cannot reproduce the item accurately from worker to worker. Again, it is all about having the same reference.

