Matsusada Precision

Technical Terms

In the context of measurement instruments such as power supplies and multimeters, the digit is a measure of a display's resolution and, by extension, of the instrument's precision. Display resolution is often specified in digits, for example "4 ½ digits". This notation gives the number of full digits the display can show, plus the range of the most significant (leftmost) digit.

A "full digit" is one that can display any value from 0 to 9. A "½ digit" means the most significant digit can only display a 0 or a 1. Therefore, a 4 ½ digit multimeter can display values from 00000 up to 19999, for a total of 20,000 counts. Similarly, a 3 ¾ digit display has a most significant digit that goes up to 3, providing 4,000 counts (some manufacturers use the same notation for displays with higher counts, such as 6,000).

The term "digit" is also used when specifying an instrument's accuracy. An accuracy specification might be stated as "±(0.05% of reading + 3 digits)". Here, "3 digits" means 3 counts of the least significant digit (LSD) of the measurement.
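The relationship between digit notation and counts can be sketched as a small helper (a minimal illustration; the function name is hypothetical):

```python
def display_counts(full_digits: int, max_msd: int) -> int:
    """Total counts for a display with `full_digits` full digits (each 0-9)
    plus a partial most-significant digit that can show 0..max_msd."""
    # Each full digit multiplies the count by 10; the partial MSD
    # contributes (max_msd + 1) possible values.
    return (max_msd + 1) * 10 ** full_digits

# 4 1/2 digits: four full digits, MSD limited to 0 or 1
print(display_counts(4, 1))  # 20000 counts (00000 to 19999)

# 3 3/4 digits: three full digits, MSD limited to 0..3
print(display_counts(3, 3))  # 4000 counts
```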

For example, if a 4 ½ digit meter is on the 20V range and reads 10.000V, the least significant digit represents 0.001V. An error of "3 digits" would correspond to an uncertainty of ±0.003V. This part of the accuracy specification accounts for internal noise and conversion errors in the instrument.
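The calculation above can be written out directly (a sketch, assuming the "±(% of reading + digits)" spec form described here; the function name is illustrative):

```python
def measurement_uncertainty(reading: float, pct_of_reading: float,
                            digits: int, lsd: float) -> float:
    """Uncertainty for a spec of the form ±(pct_of_reading % + N digits).
    `lsd` is the value of one least-significant-digit count on the range
    in use (e.g. 0.001 V for a 4 1/2 digit meter on the 20 V range)."""
    return reading * pct_of_reading / 100 + digits * lsd

# 4 1/2 digit meter on the 20 V range, spec ±(0.05% of reading + 3 digits)
u = measurement_uncertainty(10.000, 0.05, 3, 0.001)
print(f"±{u:.3f} V")  # ±0.008 V
```

For the 10.000 V reading, the percentage term contributes ±0.005 V and the 3-digit term ±0.003 V, giving a total of ±0.008 V.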
