Electronic – How accurate should I expect the low end of the multimeter's resistance measurement range to be?

multimeter resistance

I have an old Fluke 79-III and I'm trying to check the resistance of some resistors that should be 0.22 Ohms. Unfortunately, they all read 0.6 Ohms, so I'm wondering how accurate I should expect the meter's main resistance function to be.

The specs from the Fluke 79-II user manual (p. 39), which agree with the Fluke 79-III instruction sheet (p. 11), are:

| Function | Range      | Resolution | Accuracy         | Burden Voltage (Typical) |
|----------|------------|------------|------------------|--------------------------|
| Ohm      | 400.0 Ohm  | 0.1 Ohm    | ±(0.4%+2)        | Not applicable           |
|          | 4.000 kOhm | 0.001 kOhm | ±(0.4%+1)        |                          |
| 40 Ohm   | 40 Ohm\*   | 0.01 Ohm   | 5% Typical\*\*\* | Not applicable           |

\* In the 40 Ohm and 40 mV ranges, thermals may introduce additional errors. To maximize accuracy, keep both probe tips at similar temperatures.

\*\*\* Accuracy applies after lead resistance compensation.

I believe the resolution means that, while I shouldn't expect the meter to read exactly 0.22 ohms, I should at least expect it to read 0.2 or 0.3 ohms.
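As a sketch of what the 0.1 Ohm resolution alone implies, assuming a simple round-to-nearest display (the manual doesn't state the rounding behaviour):

```python
# Assumed round-to-nearest display; the manual doesn't specify the rounding.
resolution = 0.1       # Ohms per least-significant digit on the 400 Ohm range
true_value = 0.22
displayed = round(true_value / resolution) * resolution
print(f"{displayed:.1f} Ohms")   # 0.2 Ohms on an otherwise error-free meter
```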

Edit 1) I don't understand the accuracy rating, though. Does the +2 mean that I should expect the reading to be off by up to two counts in the least significant digit? Is the 0.4% accuracy 0.4% of the 400 Ohm range (i.e. 1.6 ohms), or 0.4% of the current reading?

Looking closer, the spec from the Fluke 79-II user manual confirms that:

> Accuracy specifications are given as: ±([% of reading] + [number of least significant digits])

So B Pete's answer of ±0.2 ohms looks good.

Edit 2) Measuring the resistance of the leads by shorting them together gives 0.3 ohms, so the 0.6 ohms I originally measured is well within the accuracy envelope of (0.22 ± 0.2 ohm) + (0.3 ± 0.2 ohm).
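A quick worst-case check of that envelope, assuming the two ±(0.4% + 2 digit) tolerances simply add (the manual doesn't say how the errors combine):

```python
# 400 Ohm range spec: ±(0.4% of reading + 2 counts of 0.1 Ohm)
tol = lambda reading: 0.004 * reading + 2 * 0.1

resistor, leads, measured = 0.22, 0.3, 0.6
low = (resistor - tol(resistor)) + (leads - tol(leads))    # ~0.12 Ohms
high = (resistor + tol(resistor)) + (leads + tol(leads))   # ~0.92 Ohms
print(low <= measured <= high)   # True: 0.6 Ohms is inside the envelope
```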

Edit 3) Also, I did originally try the meter's lead resistance compensation (40 Ohm) mode (instruction sheet p. 6) and measured 0.15 ohms, so I discounted it as also being inaccurate.

At the quoted 5% accuracy, that reading should have been correct to within ±0.011 ohms.

Checking again now, however, I see that when I do a lead resistance compensation, the longer I keep the range button pressed, the more stable it becomes, converging on a reading of 0.05 ohms. Measuring the resistor, it now shows 0.17 ohms; sum these and (ta-da) we get 0.22 ohms.
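The same numbers can be sanity-checked against the 40 Ohm range's 5% typical spec (a sketch that assumes the 5% is taken of the 0.22 ohm nominal value, matching the ±0.011 figure above):

```python
nominal = 0.22
spec = 0.05 * nominal                        # ±0.011 Ohms at 5% typical

print(abs(0.15 - nominal) <= spec)           # False: the hasty first compensation
print(abs((0.17 + 0.05) - nominal) <= spec)  # True: converged compensation + residual
```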

Not bad for a 10+ year old meter that hasn't been calibrated in at least 6 years. *8')


Ultimately, should I expect my multimeter to get close to being able to measure 0.22 Ohms, or should I not expect much accuracy below a few ohms?

Best Answer

The Fluke meters I am familiar with are specified as ±(given percent of measurement + given number of least significant digits). For the specs quoted above, using the 400.0 Ohm range, this yields ±(0.004 × 0.22 + 2 × 0.1) = ±0.20088, or approximately ±0.2 ohms, if the meter is accurately calibrated.
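As a sketch of that arithmetic in Python (the function and parameter names are mine, not Fluke's):

```python
def tolerance(reading, percent, digits, resolution):
    """±(percent of reading + digits least-significant counts)."""
    return (percent / 100.0) * reading + digits * resolution

print(tolerance(0.22, percent=0.4, digits=2, resolution=0.1))  # 0.20088 -> about ±0.2 Ohms
```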

When measuring low resistance values, especially at the sub-1-ohm level, you will need to consider the resistance of your probes and of the connections to the meter and the resistor. If you make a resistance measurement with your probes shorted together, you can get an idea of the resistance of your probe connections.

To measure sub-1-ohm resistors more accurately without resorting to a Wheatstone bridge or other indirect measurement methods, I would recommend using a bench DVM with 4-wire (Kelvin) resistance measurement capability. This removes the resistance of your probes from the measurement.
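If the bench DVM speaks SCPI, the 4-wire reading can also be taken programmatically. A minimal sketch using pyvisa, with a hypothetical VISA address; your instrument's exact command set may differ:

```python
import pyvisa

rm = pyvisa.ResourceManager()
dvm = rm.open_resource("GPIB0::22::INSTR")  # hypothetical instrument address
ohms = float(dvm.query("MEAS:FRES?"))       # FRES = 4-wire (Kelvin) resistance
print(f"{ohms:.4f} Ohms")
dvm.close()
```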