Electronic – Accuracy when measuring current (mA) and small voltages

current measurement, multimeter

I'm looking at a circuit (part of a bigger circuit) that consists of a 10 ohm resistor and a red LED.

  • It's running on 2.05V (checked with 2 different multimeters, which read from 2.047V to 2.05V).
  • The voltage drop across the LED (again measured with 2 different multimeters on the live circuit) is 1.96V.
  • The resistor measures exactly 10.0 ohms (again checked with both multimeters).

Now, according to Ohm's law, the current flowing through the LED should be

I = V / R = (2.05V − 1.96V) / 10Ω = 0.09V / 10Ω = 9 mA
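
For what it's worth, here is the same arithmetic as a small Python sketch (the values are the ones measured above):

```python
# Expected LED current from Ohm's law, using the measured values above.
v_supply = 2.05    # V, supply voltage
v_led = 1.96       # V, LED forward voltage measured in-circuit
r_series = 10.0    # ohms, series resistor

i_expected = (v_supply - v_led) / r_series   # current through resistor and LED
print(f"Expected current: {i_expected * 1e3:.1f} mA")   # -> 9.0 mA
```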

However, when I hook up my multimeters I get the following readings:

  • Multimeter A : 7.30mA
  • Multimeter B : 6.10mA

It took them about 30 seconds to stabilize.

Why am I seeing that much of a difference?

Multimeter A is about 4x more expensive than multimeter B, so I'd expect it to be more accurate. Still, the difference between the measured 7.3mA and the calculated 9mA is too big to ignore, no? Or am I missing something?
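
To put a number on "too big to ignore", here is a quick sketch of the relative error of each reading against the calculated 9 mA:

```python
# How far off each reading is from the 9 mA calculated value.
i_calculated = 9.0   # mA
for name, reading in [("Multimeter A", 7.30), ("Multimeter B", 6.10)]:
    error = (reading - i_calculated) / i_calculated * 100
    print(f"{name}: {reading:.2f} mA ({error:+.0f}% vs calculated)")
# Multimeter A: 7.30 mA (-19% vs calculated)
# Multimeter B: 6.10 mA (-32% vs calculated)
```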

Best Answer

Your multimeters will have roughly 1 or 2 ohms of input resistance on their current ranges. A meter needs a small (but non-zero) shunt resistance so that it can develop a measurable voltage across it: in other words, it "infers" the current from the voltage dropped across that small resistor. That same resistor sits in series with your circuit, so it reduces the current through the LED.

Both meters will have slightly different shunt resistances, hence the different readings.
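
As a rough illustration, here is a sketch that back-calculates the series resistance each meter must be adding, assuming the LED forward voltage stays at the 1.96 V measured without the meter in the loop (in reality it sags a little as the current falls, so the true burden resistance is somewhat lower than these figures suggest):

```python
# Back-of-envelope estimate of each meter's burden (shunt) resistance.
# Assumes the LED forward voltage stays at 1.96 V; in practice it drops
# slightly as the current falls, so these are upper estimates.
v_supply = 2.05    # V
v_led = 1.96       # V
r_series = 10.0    # ohms

for name, i_measured_ma in [("Multimeter A", 7.30), ("Multimeter B", 6.10)]:
    i = i_measured_ma / 1000.0                # convert mA to A
    r_total = (v_supply - v_led) / i          # total loop resistance implied by the reading
    r_burden = r_total - r_series             # extra resistance the meter adds
    v_burden = r_burden * i                   # voltage dropped inside the meter
    print(f"{name}: ~{r_burden:.1f} ohm burden, ~{v_burden * 1e3:.0f} mV dropped in the meter")
# Multimeter A: ~2.3 ohm burden, ~17 mV dropped in the meter
# Multimeter B: ~4.8 ohm burden, ~29 mV dropped in the meter
```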