Often the terms resolution, precision, and accuracy are used interchangeably, but they actually indicate very different quantities, as we discussed in the previous section. Although common sense suggests that a 6 1/2 digit multimeter must be accurate to the 6 1/2 digit level, this is not necessarily the case. The number of digits simply relates to the number of figures that the meter can display, not to the minimum distinguishable change in the input (refer to the Digits Displayed and Overranging definitions above). Therefore, when using or evaluating an instrument, remember that the number of digits may describe the display and not directly the resolution of the instrument.
You need to verify that the instrument's sensitivity and effective resolution are sufficient to guarantee the measurement resolution you need. For example, a 6 1/2 digit multimeter can represent a given range with 1,999,999 counts, or units. But if the instrument has a noise level of 20 counts peak to peak, then the minimum distinguishable change must be at least 0.52 × 20 counts and, referring to equation (4) above, the effective number of digits is correspondingly lower.
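To make this concrete, here is a minimal sketch of the calculation, assuming equation (4) takes the usual form ENOD = log10(total counts / minimum distinguishable change); the counts and noise figures are the ones from the example above:

```python
import math

total_counts = 1_999_999      # counts available on a 6 1/2 digit range
noise_pp = 20                 # peak-to-peak noise, in counts
min_change = 0.52 * noise_pp  # minimum distinguishable change, in counts

# Effective number of digits: the decades of genuinely distinguishable
# levels, not the number of digits shown on the display.
enod = math.log10(total_counts / min_change)
print(f"effective number of digits: {enod:.1f}")  # ~5.3
```

Under these assumptions, a meter displaying 6 1/2 digits delivers only about 5.3 effective digits once noise is taken into account.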
Often, this error is added in the DMM specifications under Percent Range, where the source of the error -- nonlinearity, noise, or offset -- is not identified.
This aspect of the technical specifications dates back to the first digital multimeters, which had a limited number of digits displayed to keep the cost of the instrument as low as possible. With the advent of more sophisticated digital instruments and, ultimately, of virtual instruments, the cost of the instrument display is no longer an issue. Therefore, care must be taken in specifying the number of digits of a measurement device (whether computer-based, PXI/CompactPCI, VXI, or GPIB controlled): the resolution, accuracy, nonlinearity, and noise of the measurement device must all be considered when determining the number of digits to display to the user. For example, consider an instrument that uses a 24-bit analog-to-digital converter (ADC) and can display seven digits of data (seven 9s). If the six least-significant bits are noisy and thus carry no useful information, the effective resolution of this ADC is reduced to 18 bits (about five digits), and the instrument vendor should display no more than five digits.
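A quick way to check this bits-to-digits arithmetic is the identity digits = bits × log10(2); a short sketch:

```python
import math

def effective_digits(adc_bits: int, noisy_bits: int = 0) -> float:
    """Decimal digits of resolution left after discarding noisy LSBs."""
    return (adc_bits - noisy_bits) * math.log10(2)

print(f"{effective_digits(24):.1f}")     # 7.2 -> a clean 24-bit ADC fills 7 digits
print(f"{effective_digits(24, 6):.1f}")  # 5.4 -> 18 usable bits is about 5 digits
```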
The NI 4050 and NI 4060 digital multimeters and the NI 4350/4351 temperature and voltage instruments are based on a 24-bit ADC and provide 24-bit data (7 digits). However, following these guidelines, the information returned to the user has been deliberately limited to 5 1/2 digits (18.6 bits) to maintain the correct relationship between the number of digits displayed and the effective resolution of the instrument. Because National Instruments has adopted the definition of resolution listed at the beginning of this document, the number of digits displayed matches the instrument resolution.
Sometimes it is difficult to make a clear distinction between the precision of an instrument and its accuracy. Precision, which relates to the repeatability of the measurement, is determined by the noise and short-term drift of the instrument. (Long-term drift affects precision only if it is considered over an extended period of time.) The precision of an instrument is often not provided directly; it must be inferred from other specifications, such as the 24-hour ±1 °C specification, the transfer ratio specification, noise, and temperature drift. Precision is meaningful primarily when relative measurements (relative to a previous reading of the same value) need to be taken -- a typical example is device calibration.
Accuracy of an instrument is absolute and must include all the errors resulting from the calibration process. It is interesting to note that sometimes the accuracy specifications are relative to the calibration standard used. In such a case, it is important to include in your error budget the additional errors due to this calibration standard.
In the following section, we go through an error budget calculation to determine an instrument's total accuracy.
DC Measurement
| Error Type | Reading-Dependent Errors | Noise and Range-Dependent Errors |
| --- | --- | --- |
| Specified Accuracy | % of reading × reading/100 | offset |
| Nonlinearity | | % nonlinearity × range/100 |
| System Noise | | rms noise × 6.6 (to find peak-to-peak value) |
| Settling Time | % settled × step change/100 (this must be added for scanning systems unless it is included in the accuracy specifications) | |
| NM Noise | | normal-mode noise × 10^(-NMRR/20) |
| CMV | | common-mode voltage × 10^(-CMRR/20) |
| Temperature Drift (this must be added if your temperature range is outside the specified range for the given accuracy tables) | (% of reading/°C) × X °C × reading/100, where X is the temperature difference between the specified temperature range and the actual operating temperature | (offset/°C) × X °C, where X is the temperature difference between the specified temperature range and the actual operating temperature |
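The rows of this table can be folded into a small helper for computing a worst-case DC budget. This is a sketch, not a vendor tool: the parameter names are illustrative, and every value should come from the instrument's data sheet:

```python
def dc_error_budget(reading, rng=0.0, pct_reading=0.0, offset=0.0,
                    pct_nonlinearity=0.0, rms_noise=0.0,
                    nm_noise=0.0, nmrr_db=100.0,
                    cmv=0.0, cmrr_db=100.0):
    """Worst-case DC error in volts, summing the terms tabulated above."""
    err = pct_reading / 100 * reading        # specified accuracy, % of reading
    err += offset                            # specified accuracy, offset term
    err += pct_nonlinearity / 100 * rng      # nonlinearity, % of range
    err += rms_noise * 6.6                   # system noise, rms -> peak to peak
    err += nm_noise * 10 ** (-nmrr_db / 20)  # residual normal-mode noise
    err += cmv * 10 ** (-cmrr_db / 20)       # residual common-mode error
    return err
```

Summing the terms gives a conservative (worst-case) bound; in practice, independent error sources rarely all reach their maxima at once.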
Let us consider the National Instruments NI 4350 Series 5 1/2 digit Temperature and Voltage Data Logger and calculate the total accuracy of a 1 V reading. Let us also assume that the instrument is at an ambient temperature between 15 and 35 °C, and that more than 90 days but less than one year have passed since the last calibration. The total accuracy based on the error budget determined above is:
| Error Type | Percent of Reading Errors | Range-Dependent Errors |
| --- | --- | --- |
| Specified Accuracy | 1 V × 0.0131% = 131 µV | 3 µV |
| Nonlinearity | | 0, included in accuracy |
| System Noise | | 0, included in offset |
| Settling Time | not needed; the specification table includes any errors due to scanning | |
| NM Noise (assume 1 mVrms of environmental noise) | | 1 mV × 1.4 × 10^(-100/20) = 0.01 µV |
| CMV (assume maximum CMV of 2.5 V) | | 2.5 V × 10^(-100/20) = 25 µV |
| Temperature Drift | N/A (the NI 4350 specification tables cover 15 to 35 °C) | N/A (the NI 4350 specification tables cover 15 to 35 °C) |
| Subtotal | 131 µV | 28.01 µV |
| Total Maximum Error | 159.01 µV, or 0.016% of reading | |
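As a cross-check, the same numbers can be run through the dc_error_budget sketch above (the 1.4 factor converts the 1 mVrms environmental noise to its peak value before the NMRR attenuation, matching the table; the 100 dB NMRR and CMRR are the values assumed there):

```python
err = dc_error_budget(reading=1.0,         # 1 V DC reading
                      pct_reading=0.0131,  # specified accuracy: 131 uV
                      offset=3e-6,         # specified offset: 3 uV
                      nm_noise=1.4e-3,     # 1 mVrms x 1.4 -> peak normal-mode noise
                      nmrr_db=100,         # assumed NMRR of 100 dB
                      cmv=2.5,             # 2.5 V common-mode voltage
                      cmrr_db=100)         # assumed CMRR of 100 dB
print(f"{err * 1e6:.2f} uV = {err / 1.0 * 100:.3f}% of reading")
# -> 159.01 uV = 0.016% of reading
```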
AC Measurement
| Error Type | Reading-Dependent Errors | Range-Dependent Errors |
| --- | --- | --- |
| Specified Accuracy at a Given Signal Frequency Range | % of reading × reading/100 | offset |
| Nonlinearity | | % nonlinearity × range/100 |
| System Noise | | rms noise × 3.46 (to find peak-to-peak value, assuming Gaussian noise) |
| Settling Time | % settled × step change/100 (this must be added for scanning systems unless it is included in the accuracy specifications) | |
| CMV | | common-mode voltage × 10^(-CMRR/20) |
| Temperature Drift (this must be added if your temperature range is outside the specified range for the given accuracy tables) | (% of reading/°C) × X °C × reading/100, where X is the amount of temperature drift from the specified temperature range | (offset/°C) × X °C, where X is the amount of temperature drift from the specified temperature range |
| Crest Factor Error | X × reading/100; add X% additional error based on the type of waveform | |
Let us consider the NI 4050 5 1/2 digit multimeter and calculate the total accuracy of a 1 Vrms reading. Let us also assume that the instrument is at an ambient temperature between 15 and 35 °C, and it has been one year since the last calibration was performed. Because this is an AC measurement, we need to specify the frequency of the signal measured and the crest factor. Let us assume a frequency of 1 kHz and a crest factor of 2. The total accuracy based on the error budget determined above is:
| Error Type | Reading-Dependent Errors | Range-Dependent Errors |
| --- | --- | --- |
| Specified Accuracy at a Given Signal Frequency Range | 0.42% × 1 V = 4.2 mV | 1.2 mV |
| Nonlinearity | included in specification table | included in specification table |
| System Noise | | included in specification table |
| Settling Time | not needed; this is not a scanning DMM, and we assume the signal is not changing | |
| CMV (assume maximum allowable CMV of 250 V) | | 250 V × 10^(-100/20) = 2.5 mV |
| Temperature Drift | not applicable; the specification table covers the 15 to 35 °C range | |
| Crest Factor Error | 0% × 1 V/100 = 0 mV | |
| Subtotal | 4.2 mV | 3.7 mV |
| Total Maximum Error | 7.9 mV, or 0.79% of reading | |
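The same kind of cross-check works for the AC budget (a sketch; the 100 dB CMRR is the value assumed in the table):

```python
reading = 1.0                   # 1 Vrms AC reading at 1 kHz, crest factor 2
err = 0.42 / 100 * reading      # specified accuracy: 4.2 mV
err += 1.2e-3                   # range-dependent offset: 1.2 mV
err += 250 * 10 ** (-100 / 20)  # residual CMV error: 2.5 mV
err += 0 / 100 * reading        # crest factor error: 0% for this waveform
print(f"{err * 1e3:.1f} mV = {err / reading * 100:.2f}% of reading")
# -> 7.9 mV = 0.79% of reading
```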
When considering a DMM or a virtual instrument, it is important to have a clear understanding of all the parameters involved in defining the characteristics of the measurement device. The number of digits listed in the data sheet is an important piece of information, but it should not be considered the ultimate or only parameter to take into account. By knowing the accuracy and resolution requirements for your application, you can compute the total error budget of the measurement device you are considering and verify that it satisfies your needs. Do not hesitate to ask a vendor to clarify the meaning of the specifications in a data sheet. Not knowing the true performance of your instrument could lead you to incorrect readings, and the cost of this error could be very high.