As a critical measuring tool for accurate and precise measurement, a micrometer needs periodic calibration. Calibration assesses the micrometer’s accuracy; if it falls outside a certain tolerance, you may need to replace it with a new one.
Calibration is not only about testing the length measurement but also, for a flat-anvil micrometer, the flatness and parallelism of the measuring faces. If the anvil is not flat enough (out of tolerance), it can affect the measurement result.
In practice, the best approach to micrometer calibration is sending your mic to an ISO 17025 accredited calibration lab. This ensures your calibration is traceable and lets you be confident in your micrometer’s accuracy. If your accuracy requirement calls for a highly trusted calibration, such as for research, let them do it for you.
However, if you don’t need that level of rigor, perhaps for a small project in your workshop, you can purchase your own workshop-grade gauge block set, micrometer holder, and other equipment, then perform the calibration yourself. Just make sure your gauge blocks have been calibrated as well.
In this post, we only cover how to calibrate micrometers with flat anvils. As said before, if you have a different anvil model, it is better to send it to a lab.
Why Do We Need to Calibrate A Micrometer?
Calibration of the micrometer is necessary to confirm that it is still within the manufacturer’s specification and to ensure that it gives accurate measurements. Like other measuring instruments such as the caliper, a micrometer is prone to wear and tear from long use and mishandling, which results in inaccurate measurements. Note also that calibrating a micrometer does not correct any damage; rely on an expert to repair any damage that happens to the micrometer.
How Often Do We Need to Calibrate A Micrometer?
Periodic calibration is recommended for every measuring tool. The same goes for a micrometer: annual calibration is advisable. However, the right interval still depends on how often and how you use the instrument.
If you perform periodic micrometer calibration, always compare the latest result with the previous ones. If there is a drift, you can predict when the micrometer is likely to wear out of tolerance. If the results of your periodic calibrations are consistently good, you may extend the calibration interval to save time and money.
Materials and Tools Required for Micrometer Calibration
The accuracy of the reference gauges/standards used for the calibration should be higher than the accuracy of the micrometer. The test accuracy ratio (TAR), which evolved from the U.S. Military Standard MIL-STD-120, states that the uncertainty of the measurement standards, such as the reference gauges, measuring equipment, and tools, should not exceed 25% of the tolerance of the tool being calibrated, i.e., a 4:1 ratio, which is also the conformance decision rule of the ASME B89.1.13 Standard.
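The 4:1 rule above reduces to a one-line comparison. The sketch below illustrates it; the example tolerances (a ±2 μm micrometer checked against roughly ±0.12 μm workshop gauge blocks) are illustrative assumptions, not values from this article:

```python
def meets_tar(tool_tolerance_um: float, standard_uncertainty_um: float,
              required_ratio: float = 4.0) -> bool:
    """True if the reference standard is accurate enough to calibrate a tool
    with the given tolerance (test accuracy ratio >= 4:1 by default)."""
    return tool_tolerance_um / standard_uncertainty_um >= required_ratio

# Illustrative numbers: a micrometer with +/-2 um tolerance against
# gauge blocks good to ~0.12 um gives a TAR of about 16:1 -> acceptable.
print(meets_tar(2.0, 0.12))  # True
# The same micrometer against a 1 um standard is only 2:1 -> not acceptable.
print(meets_tar(2.0, 1.0))   # False
```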
The following are some of the basic tools used for micrometer calibration.
- Gauge Block Sets. Select a gauge block set whose range covers the maximum length of the micrometer to be calibrated. Grade 0 gauge blocks are suitable for calibrating workshop micrometers. Also select gauge blocks in the same unit system (mm or inch) as the micrometer to be calibrated.
- Gauge Block Holder. For a long-range micrometer that needs long gauge block stacks, use a gauge block holder to properly stack the blocks to the desired length.
- Micrometer Checker. A micrometer checker can be used instead of gauge blocks for the length measurement error test.
- Micrometer Setting Standards. Setting standards are used for the zero setting and zero-setting tests of micrometers. They can also be used for the linear measurement or measurement error tests.
- Optical Flats and Monochromatic Light. Optical flats and a monochromatic light source are used to check the flatness of the measuring faces of outside and inside micrometers.
- Micrometer Stand. A micrometer stand is used to hold the micrometer for stable and accurate measurement.
- Surface Plates. Any measurement during the calibration should be done on a surface plate that serves as a horizontal reference plane.
- Thermometer and Thermo-Hygrometer. Temperature and relative humidity should be maintained at 20°C/68°F and 30% to 40% respectively. Since they contribute to errors during calibration, they should be properly monitored and recorded using a thermometer and thermo-hygrometer.
- Cleaning Materials. Use lint-free cloth and isopropanol.
- Gloves/finger-cots. Use lint-free gloves and finger cots.
- Worksheets/Check Sheets/Datasheets. A checksheet must be prepared to record the results of the tests conducted during calibration, to provide reference data for interpreting the results, and to make sure no important checkpoints are missed during the calibration process.
Preliminary Inspection of the Micrometer and Reconditioning of the Micrometer
As with caliper calibration, micrometer calibration starts with an initial inspection of the measuring tool. The preliminary inspection can detect both minor and major defects: minor defects can be corrected on the spot, while major defects need to be repaired by an accredited calibration laboratory. Preliminary inspection checkpoints may also be listed on the checksheets as a reference.
1. The parts of the micrometer must be complete.
2. There should not be any damage. Bent, broken, or hammered parts of the micrometer need repair service. Dents should be examined to see whether they affect the accuracy of the micrometer, especially on the anvil and spindle. Corrosion on the micrometer’s parts must also be controlled.
3. The scale and graduations must be visible. For a digital micrometer, make sure that the reading on the LCD is clear and legible. It is advisable to fit a digital micrometer with new batteries before the calibration.
4. For an outside micrometer, the spindle must move smoothly when the thimble is rotated. Confirm this by running the thimble from zero up to the maximum range and back. There should be no sticking or bumping during the run. Likewise, for an inside micrometer, the movable jaw should move smoothly. Bumps indicate that there is a problem with the micrometer.
5. Check if the ratchet is functioning well.
Verify this by operating the ratchet with the spindle locked: the ratchet should slip rather than force the spindle. Improper functioning of the ratchet can lead to inaccurate results, because the ratchet limits the measuring force and keeps it consistent from one measurement to the next. If the spindle is difficult to turn, or the ratchet does not slip at the set force, the micrometer will not measure reliably; a repair may be able to fix the malfunction. A properly working ratchet that applies a consistent measuring force is needed for accurate results.
6. Initially, check the flatness and parallelism of the measuring faces visually: close the measuring faces and hold the micrometer against the light. Any offset or gap between the measuring faces should be analyzed to determine whether it can still be adjusted or needs to be repaired by the calibration laboratory.
To ensure correct measurements, the measuring faces of the anvil and spindle must be flat and parallel, without any bumps. Bumps and nicks prevent the faces from seating properly on the object, which gives inaccurate results. Make sure that the anvil and spindle faces are flat and smooth.
Preparation for the Micrometer Calibration
- Make sure that the micrometer and the equipment to be used are already cleaned. It must be free of dirt, dust, rust, and any other foreign materials.
- Worksheets or checksheets must be prepared. They must include the micrometer type, model no., serial no., code or ID (if assigned), manufacturer’s name, size or range of the micrometer, and the technician’s or custodian’s name.
- Also prepare the micrometer’s technical specification and the calibration certificate issued with the instrument at purchase, as references.
- Check and maintain the temperature of the calibration room using a thermometer and thermo-hygrometer. Ideally, room temperature should be controlled at 20°C (68°F) and 30% to 40% relative humidity.
- Store all the micrometers, gauge blocks, micrometer setting standards, optical flats, optical parallels, and all other measuring instruments needed for the calibration in the calibration room for at least two hours so that they stabilize at room temperature.
General Micrometer Calibration Procedure
The calibration method discussed below was derived from several existing standards: the American standard ASME B89.1.13-2013, the Japanese standard JIS B 7502:2016, and the international standard ISO 3611:2010.
However, the maximum permissible errors for parallelism and length measurement for a given micrometer range, presented along with the calibration process below, are based on ASME B89.1.13-2013.
1. Record the actual temperature and humidity of the calibration room on the worksheet before the calibration. Make sure that the temperature is within 20°C ± 2°C (68°F) and the humidity is within 30% to 40% relative humidity. Also monitor changes in temperature and humidity during the calibration process. Temperature is a source of error and measurement uncertainty in micrometer calibration. Therefore, if there is a large temperature change outside the mentioned limits, it is better to stop the calibration and start again than to try to compute the error caused by the change in temperature.
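To see why the temperature limit matters, the error from a temperature offset can be estimated with the standard linear expansion formula. This is a hedged sketch, not part of the procedure above; the expansion coefficient for steel (~11.5 × 10⁻⁶ per °C) is an assumed typical value:

```python
def thermal_error_um(length_mm: float, delta_t_c: float,
                     alpha_per_c: float = 11.5e-6) -> float:
    """Approximate length error, in micrometres, of a steel part that is
    delta_t_c degrees away from the 20 C reference temperature.
    alpha_per_c is the coefficient of linear expansion (steel ~ 11.5e-6/C)."""
    return length_mm * alpha_per_c * delta_t_c * 1000.0  # mm -> um

# A 25 mm steel gauge block measured 2 C above 20 C is about 0.575 um long,
# already a large fraction of a typical micrometer tolerance:
print(round(thermal_error_um(25.0, 2.0), 3))  # 0.575
```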
2. Perform the calibration on a surface plate.
3. Perform zero setting check.
Set the micrometer to the zero position and make sure it is properly zeroed. For a digital micrometer, make sure that the displayed value is also zero. If not, adjust the micrometer to zero using the spanner wrench. For a micrometer whose range does not start at zero, use a micrometer setting standard rod to properly set the zero position.
4. Perform a flatness error test for the micrometer whose measuring faces are flat.
Micrometer manufacturers state the actual flatness of the measuring faces in the product’s technical specification or calibration certificate.
According to ASME B89.1.13, the maximum flatness error allowable should not exceed 1μm (40μin.) for each measuring face. For JIS B 7502:2016, the flatness of the measuring faces of the micrometer ranging from 0mm to 300mm should not exceed 0.6μm, and those above 300mm should not exceed 1μm.
To determine the flatness of the measuring face, place an optical flat against the measuring face of the micrometer.
In ISO 3611:2010, the flatness check of the measuring faces is no longer required.
4.1. Place optical flat on the measuring face to be checked for flatness.
4.2 Wring the optical flat lightly onto the measuring face of the micrometer until the number of fringes is reduced to a minimum.
4.3 The shape of the measuring face is indicated by the appearance of the fringes. Refer to the illustration shown below.
4.4 Count the number of fringes and multiply it by 0.32μm to get the flatness value.
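The fringe-to-flatness arithmetic in step 4.4 can be sketched as follows. The 1 μm pass/fail limit is the ASME B89.1.13 per-face value quoted earlier; the fringe counts in the example are hypothetical:

```python
HALF_WAVELENGTH_UM = 0.32  # one fringe ~ half a wavelength of the light used

def flatness_um(fringe_count: int) -> float:
    """Convert an interference-fringe count (seen under an optical flat
    with monochromatic light) into a flatness value in micrometres."""
    return fringe_count * HALF_WAVELENGTH_UM

def flatness_ok(fringe_count: int, limit_um: float = 1.0) -> bool:
    """Check a measuring face against a maximum permissible flatness error
    (1 um per face, per ASME B89.1.13)."""
    return flatness_um(fringe_count) <= limit_um

print(round(flatness_um(2), 2), flatness_ok(2))  # 0.64 True
print(round(flatness_um(4), 2), flatness_ok(4))  # 1.28 False
```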
5. Perform parallelism error test if necessary.
As defined in the ASME B89.1.13-2013 Standard, the parallelism error of a micrometer should not exceed the tolerances shown in the table below. Like flatness, a micrometer’s parallelism is also indicated in the manufacturer’s technical specification. The measured parallelism may exceed the manufacturer’s figure and still be acceptable, provided it remains within the maximum permissible error of the mentioned standard.
There are two common methods to conduct the parallelism check.
5.1. Parallelism check using optical parallel and the monochromatic light for outside micrometer.
5.1.1 Place the optical parallel between the measuring faces of the micrometer, and gently close the spindle on it using normal measuring force.
5.1.2 Look closely at the red interference fringes on the optical parallel.
5.1.3 Count the number of red interference fringes on both measuring faces.
5.1.4 Lightly re-seat the measuring faces on the optical parallel and repeat step 5.1.3 until the number of fringes, preferably on the anvil side, is at a minimum.
5.1.5 Now multiply the fringe count of the side with the higher number of fringes by 0.32μm. This computed value is the parallelism of the anvil and spindle. Each fringe corresponds to half a wavelength of the light, which is approximately 0.32μm. Note that this factor holds only for the monochromatic light source the optical parallel is intended for, since each kind of light has a different wavelength.
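The arithmetic of the optical-parallel method above, taking the face with the higher fringe count as the article describes, can be sketched like this; the fringe counts in the example are hypothetical:

```python
HALF_WAVELENGTH_UM = 0.32  # one fringe ~ half a wavelength of the light used

def parallelism_um(anvil_fringes: int, spindle_fringes: int) -> float:
    """Parallelism estimate per the procedure above: take the measuring face
    showing the most fringes and multiply by the half wavelength (um)."""
    return max(anvil_fringes, spindle_fringes) * HALF_WAVELENGTH_UM

# Example: 1 fringe on the anvil, 3 on the spindle face:
print(round(parallelism_um(1, 3), 2))  # 0.96
```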
5.2. Parallelism check using gauge blocks or precision spheres.
5.2.1 Select the desired length or nominal size to be measured.
5.2.2 Get the measurement data of the gauge block or the precision sphere on the five partial measuring face point contact as shown in the illustration.
5.2.3 Of the five readings gathered, the largest deviation from the nominal size is taken as the parallelism of the measuring faces.
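The five-point evaluation in steps 5.2.1 to 5.2.3 reduces to picking the largest deviation from the nominal size. A minimal sketch, with hypothetical readings of a 10 mm gauge block:

```python
def parallelism_from_points(readings_mm, nominal_mm):
    """Five-point parallelism check: measure a gauge block (or precision
    sphere) at five contact positions; per the procedure above, the largest
    deviation from the nominal size is taken as the parallelism (in mm)."""
    return max(abs(r - nominal_mm) for r in readings_mm)

# Hypothetical readings of a 10 mm gauge block at the five contact points:
readings = [10.000, 10.001, 9.999, 10.002, 10.000]
print(round(parallelism_from_points(readings, 10.0) * 1000, 1))  # 2.0 (um)
```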
6. Perform length measurement error test.
This test is conducted to determine the error with full measuring-face contact.
No standard mandates specific test points for each range, but some standards recommend sample lengths to use if you calibrate with gauge blocks.
For example, in ASME B89.1.13-2013, using slip gauges, the mentioned lengths are 7.7mm, 12.9mm, 17.6mm, 22mm, and 25mm for a 25mm micrometer, and for a 1in micrometer, the lengths to be checked are 0.210in, 0.420in, 0.605in, 0.815in, and 1.000in. It is also mentioned that these test points correspond to the 0°, 72°, 144°, and 288° rotational positions of the screw.
For the Japanese and ISO standards, the example lengths are 2.5mm, 5.1mm, 7.7mm, 10.3mm, 12.9mm, 15.0mm, 17.6mm, 20.2mm, 22.8mm, and 25.0mm for a 25mm micrometer. These were chosen so that the spindle is measured at points spread across different fractions of the screw’s thread pitch.
In any case, it is good practice to use at least five test points for the length measurement check during calibration.
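You can verify how the recommended test lengths sample different thimble positions with a few lines of arithmetic. This sketch assumes the common 0.5 mm spindle thread pitch; the angle is just the fractional part of the screw revolution at each length:

```python
PITCH_MM = 0.5  # typical micrometer screw thread pitch (assumed here)

def screw_angle_deg(length_mm: float) -> int:
    """Thimble rotation angle (0-359 degrees) at which a given test length
    falls: the fractional part of the screw revolution, times 360."""
    turns = length_mm / PITCH_MM
    return round((turns % 1.0) * 360.0) % 360

# The ASME test points for a 25 mm micrometer land on different screw angles:
for length in (7.7, 12.9, 17.6, 22.0, 25.0):
    print(length, screw_angle_deg(length))  # 144, 288, 72, 0, 0
```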
Below is the standard maximum permissible error for the length measurement error test based on ASME B89.1.13-2013.
6.1 For outside micrometer and inside micrometer
6.1.1 Prepare the gauge blocks that are needed for the lengths to be measured. A micrometer checker can also be used if available. Make sure to indicate on the checksheets the lengths of the gauge blocks used or simply follow the checkpoints if already indicated on checksheets.
For an outside micrometer, mount the micrometer properly on a micrometer stand.
For an inside micrometer, prepare the gauge block holder and half-round jaw and stack the needed length to be measured.
6.1.2 Measure the gauge blocks using the micrometer for each test point. Repeat five times and record on the checksheet.
Subtract the gauge block’s dimension from the reading indicated on the micrometer to obtain the error. Record the error data.
6.1.3 After measuring all the nominal sizes, compute the average of the five readings for each size.
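The repeat-and-average arithmetic of steps 6.1.2 and 6.1.3 can be sketched as below (error taken as reading minus nominal); the five repeat readings in the example are hypothetical:

```python
def length_error_um(readings_mm, nominal_mm):
    """Average measurement error at one test point: mean of the repeated
    micrometer readings minus the gauge block's nominal size, in um."""
    mean = sum(readings_mm) / len(readings_mm)
    return (mean - nominal_mm) * 1000.0  # mm -> um

# Hypothetical five repeats at the 12.9 mm test point:
reps = [12.901, 12.900, 12.902, 12.901, 12.901]
print(round(length_error_um(reps, 12.9), 1))  # 1.0 (um), to compare with MPE
```

The resulting error per test point is then compared against the maximum permissible error table from ASME B89.1.13-2013 mentioned above.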
6.2 For depth micrometer
6.2.1 Place the gauge block on the granite surface plate.
6.2.2 Measure the gauge block five times and record the readings on the check sheet. Subtract the gauge block’s dimension from the reading indicated on the micrometer to obtain the error.
6.2.3 After measuring all the nominal sizes, compute the average of the five readings for each size.
7. Evaluate the data and finalize the calibration result.
That is a short guide on how to run a micrometer calibration. Though it doesn’t cover all types of micrometers, hopefully it gives you an idea of what to do, or at least lets you make a check. As said before, depending on your accuracy requirement, your micrometer calibration may need to be done by an accredited metrology lab, whether onsite or offsite. If you are in search of a new micrometer and don’t want to calibrate it yourself, you can simply buy a calibrated micrometer that comes with a certificate of traceability to NIST. But make sure the lab that calibrated the mic is trusted (you can call them directly) and that the product conforms to your accuracy tolerance.