For DMMs, accuracy is usually expressed as a percent of reading. For example, an accuracy of ±1 percent of reading means that when the digital multimeter displays 100.0V, the actual voltage may be anywhere between 99.0V and 101.0V.
Specification sheets may also add a count value to the basic accuracy. This figure is the number of counts by which the rightmost digit of the display may vary. In the previous example, the accuracy might be stated as ±(1 percent + 2). Thus, if the DMM reads 100.0V, the actual voltage will be between 98.8V and 101.2V.
The accuracy of an analog meter is calculated from the full-scale error, not the displayed reading. Typical analog meter accuracy is ±2 percent or ±3 percent of full scale. Typical basic DMM accuracy is between ±(0.7 percent + 1) and ±(0.1 percent + 1) of reading, or even better.
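The ±(percent of reading + counts) calculation above can be sketched in a short Python helper. The function name and the 0.1V-per-count resolution are assumptions for illustration, matching the 100.0V example in the text:

```python
def dmm_accuracy_range(reading, pct_of_reading, counts, resolution):
    """Return (low, high) bounds for a DMM spec of +/-(pct_of_reading % + counts).

    resolution is the value of one count in the least-significant display
    digit, e.g. 0.1 V when the meter shows 100.0 V.
    """
    error = reading * pct_of_reading / 100.0 + counts * resolution
    return reading - error, reading + error

# The example from the text: +/-(1 % + 2 counts) on a 100.0 V reading.
low, high = dmm_accuracy_range(100.0, 1.0, 2, 0.1)
print(low, high)  # 98.8 101.2
```

Dropping the count term (counts=0) reproduces the simpler ±1 percent of reading case, giving 99.0V to 101.0V.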
Measuring resistance in a circuit. Resistance values vary widely, from a few milliohms (mΩ) for contact resistance to billions of ohms for insulation resistance. Many DMMs measure resistance as low as 0.1Ω, and some can measure as high as 300 megohms (300,000,000 ohms). For an extremely large resistance, a Fluke multimeter displays "OL", indicating that the measured resistance exceeds the range; "OL" is also displayed when measuring an open circuit.
When making accurate measurements of low resistances, the resistance of the test leads must be subtracted from the measured value. Typical test lead resistance is between 0.2Ω and 0.5Ω. If the resistance of the test leads is greater than 1Ω, the test leads must be replaced.
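The lead-compensation step above can be sketched as a small helper. The function name and the example values are assumptions for illustration; the lead resistance is what the meter reads with its two probes shorted together:

```python
def corrected_resistance(measured_ohms, lead_ohms):
    """Subtract test-lead resistance from a low-resistance measurement.

    lead_ohms is found by touching the two test leads together and
    reading the meter. Per the text, leads above 1 ohm should be replaced
    rather than compensated for.
    """
    if lead_ohms > 1.0:
        raise ValueError("test lead resistance exceeds 1 ohm: replace the leads")
    return measured_ohms - lead_ohms

# Example: meter shows 5.6 ohms, shorted leads read 0.3 ohms,
# so the actual resistance is about 5.3 ohms.
print(corrected_resistance(5.6, 0.3))
```

This correction only matters for low-resistance measurements, where 0.2Ω to 0.5Ω of lead resistance is a significant fraction of the reading.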
