Electronic – Does a DMM alter the real value of current flow, when measuring current

Tags: current measurement, low-power, multimeter

I am working on a project that utilises the low-power mode of the ATmega328P, which essentially puts the microcontroller to sleep when it is not in use. I have a Fluke 79 Series II DMM, which I use to measure how much current my ATmega draws in its low-power state. It reads 0.007 mA, or 7 μA.

My question is: does the multimeter offset the true current draw, given that it is itself part of the circuit? That offset usually isn't enough to affect a reading, but is it when the current is this low?

In the Fluke 79ii manual, it states (p23): "Input current through R5+R6 (for mA)…develops a voltage that is proportional to the input."

Further on, the manual lists R5 as "WW, 9.99, ±0.25%, 1W, 50PPM" and R6 as "WW, 0.010, ±0.25%, 1W, 100PPM". Can anyone confirm what units those figures are in, and what PPM means?

Can we determine if the meter influences the current draw, and if so, can we calculate how much?

Thank you in advance for your help,

Smeato

Best Answer

Making a measurement with a DMM disturbs the circuit you connect it to: a voltage measurement draws a load current, and a current measurement inserts a voltage drop into the path. In most situations the load current tends to be insignificant, whereas the voltage drop can easily be a concern. In either case, you can calculate the effects and then decide whether to ignore them or allow for them.

The typical DMM has an input impedance of 10 Mohm on its voltage ranges (though some newer, very cheap ones have a 1 Mohm input resistance), and the voltage drop when measuring a full-scale current tends to be 200 mV. For the sorts of voltages (1 to 10 V) and the sorts of currents (tens of uA to hundreds of mA) that the hobbyist deals with 99% of the time, the current taken by a DMM is insignificant (<0.1%), but the voltage drop when measuring current could well be significant (>1%).
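To put a number on the voltage-loading case, here is a minimal Python sketch. The 10 kohm source resistance is a hypothetical example value, not a figure from the question:

```python
def voltage_loading_error(r_source, r_input=10e6):
    """Fraction by which a reading is pulled low when the meter's
    input resistance shunts a node with the given source resistance."""
    return r_source / (r_source + r_input)

# Probing a node with 10 kohm source resistance using a 10 Mohm meter:
err = voltage_loading_error(10e3)
print(f"reading low by {err * 100:.2f}%")  # about 0.1%, i.e. negligible
```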

If you want to measure (say) 20 uA full scale, the meter will insert a series resistance of 10 kohm into the circuit (10k × 20u = 200 mV). A current of 7 uA will then create a voltage drop of 10k × 7u = 70 mV, which is about 2% of a 3.3 V supply rail.
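The burden-voltage arithmetic above can be sketched directly. This assumes the 10 kohm shunt and 3.3 V rail used in the example:

```python
def burden_voltage(current_a, r_shunt_ohm):
    """Voltage drop the meter's shunt inserts into the measured path."""
    return current_a * r_shunt_ohm

v_full_scale = burden_voltage(20e-6, 10e3)  # 0.2 V at 20 uA full scale
v_sleep = burden_voltage(7e-6, 10e3)        # 0.07 V = 70 mV at 7 uA
frac_of_rail = v_sleep / 3.3                # roughly 2% of a 3.3 V rail
```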

If you want to check this voltage drop with a second DMM, the 10 Mohm load across the 70 mV will cause an extra 70m/10M = 7 nA to flow through the second meter, causing a reading error of 0.1% (7n/7u).
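The second-meter loading works out like this, using the same figures as above:

```python
# Second DMM (10 Mohm input) placed across the 70 mV burden voltage:
i_extra = 0.070 / 10e6       # extra current through the second meter: 7 nA
error = i_extra / 7e-6       # relative error on the 7 uA reading: 0.1%
print(f"extra current {i_extra * 1e9:.0f} nA, error {error * 100:.1f}%")
```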

The PPM in your Fluke manual describes the temperature coefficient of the resistors, in parts per million per degree C. The smaller the figure the better: you not only want your current meter to read the same in different ambient temperatures, you also want it to read the same when the shunt resistors heat up from the current they are carrying. A 50 ppm resistor could change value by up to 0.1% over a 20 °C temperature change (though it will typically change less than that).
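The tempco worst case is a one-line calculation:

```python
def tempco_drift(ppm_per_degc, delta_t_degc):
    """Worst-case fractional resistance change for a given tempco."""
    return ppm_per_degc * delta_t_degc * 1e-6

# A 50 ppm/degC shunt (Fluke's R5) over a 20 degC swing:
drift = tempco_drift(50, 20)  # 0.001, i.e. 0.1% worst case
```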

Why this asymmetry, where a voltage reading tends to be non-intrusive but a current reading is intrusive? While it is possible to make an active current measurement with zero voltage drop, that requires an amplifier and power supply in the meter, which is a no-no for battery-powered meters, even though low-cost amplifiers will get you to tens or hundreds of mA capability. However, there are some Hall-based, near-zero-voltage-drop current sensors now available, mostly aimed at makers in the Arduino area, using the ACS712. A warning if you are tempted to buy one of these modules: not all PCBs offered on fleaBay preserve the mains-capable isolation that the IC itself provides. Study the copper clearances in the photographs carefully, and if you need mains isolation, only buy one with large clearances.
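As an illustration of how a Hall-based sensor like the ACS712 sidesteps the burden-voltage problem, here is a hedged sketch of converting its ratiometric output voltage back to a current. The 185 mV/A sensitivity applies to the 5 A variant (ACS712ELC-05B), and the 2.5 V zero-current output assumes a 5 V supply; check the datasheet for your actual part:

```python
def acs712_current(v_out, v_zero=2.5, sensitivity_v_per_a=0.185):
    """Current through the sensor, recovered from its analog output.
    Defaults assume the 5 A variant on a 5 V supply (185 mV/A)."""
    return (v_out - v_zero) / sensitivity_v_per_a

# An output of 2.685 V corresponds to +1 A through the sensor:
i = acs712_current(2.685)
```

Note that the sensor's own resolution (tens of mA at best) makes it unsuitable for measuring the 7 uA sleep current discussed above; the point is only that the measured path sees essentially zero inserted voltage drop.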