Accuracy and precision are two of the most fundamental concepts in measurement systems. These terms describe how well a measuring or sensing instrument performs, and they determine how suitable an instrument is for a particular application.
However, the two are easy to confuse. Sometimes people call a device a high-precision instrument; other times, a high-accuracy instrument. Precision sounds like accuracy, and vice versa.
If you are a beginner, learning these two terms is highly encouraged so that you can use the right term in the right situation.
So, What is Accuracy?
Accuracy describes how close a measured value is to the true value. The closer the measured value is to the true value, the higher the accuracy. Conversely, the farther the measured value is from the true value, the lower the accuracy.
Imagine shooting an arrow at a target. The spot you hit is the measured value, while the center of the target is the true value. The distance between the spot and the center determines the level of accuracy: the closer it is, the more accurate the shot.
In a real measurement, the measured value is the reading obtained from the instrument, while the true value is a universally accepted, defined value.
For example, take your tape measure and the defined value of 1 meter. Your tape measure shows a reading of 1 meter. The meter is defined as the distance light travels in a vacuum in 1/299,792,458 of a second. To claim that your tape measure is perfectly accurate (high accuracy), its 1-meter mark must match that light-travel distance exactly.
In practice, however, you cannot compare a tape measure against light directly. Instead, you use more practical measurement standards such as gauge blocks.
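To make the idea concrete, here is a minimal Python sketch that scores accuracy as the error between readings and a reference value. The reference and readings are made-up numbers, not data from a real instrument.

```python
# Accuracy: how far a reading is from the accepted true value.
# All numbers below are hypothetical.
true_value_mm = 1000.0                  # reference length, e.g. from gauge blocks

readings_mm = [1000.4, 999.7, 1000.2]   # readings from the instrument under test

for reading in readings_mm:
    error = reading - true_value_mm     # signed error of a single reading
    print(f"reading = {reading} mm, error = {error:+.1f} mm")

# The smaller the error, the higher the accuracy.
bias = sum(readings_mm) / len(readings_mm) - true_value_mm
print(f"average error (bias) = {bias:+.2f} mm")
```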
So, What is Precision?
Precision describes how close repeated measurement results are to each other. A collection of repeated measurements reveals this variation.
It does not matter whether the results are close to the true value; precision is about consistency. The smaller the differences among the results, the more precise the instrument is.
The idea behind precision is to assess whether you get the same result when measuring under the same specified conditions, or under different ones.
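As a simple illustration, the following Python sketch quantifies precision as the spread (sample standard deviation) of repeated readings taken under the same conditions. The readings are hypothetical.

```python
import statistics

# Five repeated readings of the same part, same operator,
# same conditions (hypothetical numbers, in millimeters).
repeated_readings = [25.012, 25.011, 25.013, 25.012, 25.010]

mean = statistics.mean(repeated_readings)
spread = statistics.stdev(repeated_readings)   # sample standard deviation

print(f"mean = {mean:.3f} mm, spread = {spread:.4f} mm")
# A small spread means high precision, no matter how close
# the mean is to the true value.
```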
Your instrument may produce significant differences among repeated results even when operated by the same operator under the same environmental conditions. This is low precision, and it can be caused by a worn or broken instrument, among other things.
The instrument may also show significant differences under different conditions (temperature, humidity, etc.). This is low precision as well. Perhaps the instrument is made from carbon steel instead of stainless steel; carbon steel has higher thermal conductivity, so heat transfers more readily from your hand into the instrument. Instruments such as micrometers suffer from this effect, and it can strongly impact the measurement.
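To get a feel for the magnitude, here is a rough back-of-the-envelope sketch in Python. The expansion coefficient is a typical handbook value for carbon steel, and the temperature rise is an assumed figure, not a measured one.

```python
# Rough estimate of how much a steel micrometer frame grows
# when hand heat warms it up. All inputs are assumptions.
alpha = 11.7e-6    # linear expansion of carbon steel, 1/degC (typical handbook value)
length_m = 0.025   # 25 mm of steel in the measuring loop
delta_t = 2.0      # assumed temperature rise from hand contact, degC

growth_um = alpha * length_m * delta_t * 1e6   # growth in micrometers
print(f"length change ~ {growth_um:.2f} um")   # ~ 0.59 um, near a 1 um graduation
```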
Make sure to distinguish between precision instruments and precision measurements. They are separate things. The instrument is only one element of the measurement; you cannot blame the instrument if the measurement procedure is inappropriate or invalid. Read further about repeatability and reproducibility.
Accurate and Precise
What if all of your arrows hit the center of the target? That is both high accuracy and high precision. When the two combine, you get consistently accurate shots.
The same applies to your instrument. If its results are very close to the true value and repeated readings show very small (or no) differences, you get measurements that are both accurate and precise.
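Putting both ideas together, here is a small Python sketch that labels a set of readings as accurate and/or precise. The tolerance thresholds are arbitrary example values chosen for illustration.

```python
import statistics

def assess(readings, true_value, bias_tol, spread_tol):
    """Label readings as accurate and/or precise.

    bias_tol and spread_tol are arbitrary example tolerances.
    """
    bias = abs(statistics.mean(readings) - true_value)
    spread = statistics.stdev(readings)
    return bias, spread, bias <= bias_tol, spread <= spread_tol

# Hypothetical readings of a 25.000 mm reference
bias, spread, accurate, precise = assess(
    readings=[25.001, 24.999, 25.002, 25.000],
    true_value=25.000,
    bias_tol=0.002,     # mm
    spread_tol=0.002,   # mm
)
print(f"bias = {bias:.4f} mm, spread = {spread:.4f} mm")
print(f"accurate: {accurate}, precise: {precise}")
```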
In Summary
Accuracy and precision are separate terms, but they are inseparable when it comes to quality measurements. Accuracy is judged against the true or known value; precision is judged from the measured values themselves.