Electronics – different current readings with a multimeter between the 200 mA and the 10 A slots

current, multimeter, voltage

I have a cheap multimeter that gives me different readings between its two current inputs. I would like to set 100 mA using a potentiometer in a circuit that runs from two AA batteries. If I measure the current using the standard slot, which works up to 200 mA, the meter shows about 80-90 mA. In the same circuit, if I plug the lead into the 10 A slot and switch the multimeter to the 10 A mode, it shows 0.11 A.

Could you tell me why this is happening, and how I can find out:

  1. Is my multimeter actually modifying the current, or just showing it?
  2. If not, which measurement is the correct one?

As a side question: if I buy a new multimeter, what should I look for to get a better one? Should I buy one from a well-known brand? Is there any way to identify the ones I should avoid?

Update:
I think I should tell you what my circuit is:
2x AA batteries -> 10 ohm resistor -> IR LED (specified at 1.35 V / 100 mA) -> multimeter -> back to the batteries
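
As a quick sanity check, here is an idealized Ohm's-law estimate for that loop. The nominal 3.0 V supply and the constant 1.35 V forward drop are simplifying assumptions of mine, not measurements; the actual readings below come out lower, which is consistent with battery internal resistance and voltage sag under load, plus the meter's own shunt resistance:

```python
# Idealized Ohm's-law estimate for the loop above. Assumes a nominal
# 3.0 V from two fresh AA cells and a constant 1.35 V LED forward drop;
# both are simplifying assumptions, not values measured in this post.
v_batt = 3.0     # volts, 2x AA in series (assumed)
v_led = 1.35     # volts, datasheet forward voltage at 100 mA
r_series = 10.0  # ohms

i_ideal = (v_batt - v_led) / r_series
print(f"idealized current: {i_ideal * 1000:.0f} mA")  # ~165 mA
```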

Thank you for all the detailed answers! Based on markrages' answer, I think in my case the multimeter is indeed modifying the current, because:

  1. without the multimeter, the loop resistance is 10 ohms;
  2. with the multimeter in 10 A mode, it is 10.1 ohms in total;
  3. with the multimeter in 100 mA mode, it is 12.5 ohms in total.

This would explain why I am seeing 80-90 mA vs. 110 mA in the two modes: 12.5/10.1 is roughly the same 25% ratio as the one between the two readings.

It was even more extreme when I tried to run the 1.35 V LED from a single AA battery. In that case I used a 1 ohm resistor, and the two readings on the multimeter were 35 mA vs. 100 mA. Now I understand why: the multimeter plus the 1 ohm resistor acted as a 3.5 ohm resistor. The sketch below reproduces both calculations.
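
A minimal sketch of that scaling argument, assuming the voltage across the resistive part of the loop stays roughly constant when the shunt changes (that assumption and the function name `predict_other_range` are mine; the resistances and readings are the ones quoted above):

```python
# Scale a current reading by the ratio of total loop resistances.
# Assumes the voltage across the resistive part of the loop is roughly
# the same in both meter modes, which ignores the LED's nonlinearity.
def predict_other_range(reading, r_total_now, r_total_other):
    return reading * r_total_now / r_total_other

# 2x AA case: 10 ohm resistor, ~0.1 ohm shunt (10 A mode) vs ~2.5 ohm
# shunt (mA mode); 110 mA was read on the 10 A range.
i_ma = predict_other_range(0.110, 10 + 0.1, 10 + 2.5)
print(f"predicted mA-range reading: {i_ma * 1000:.0f} mA")  # ~89 mA, matching the 80-90 mA seen

# 1x AA case: 1 ohm resistor, 100 mA read on the 10 A range.
i_ma_1aa = predict_other_range(0.100, 1 + 0.1, 1 + 2.5)
print(f"predicted mA-range reading (1 AA): {i_ma_1aa * 1000:.0f} mA")  # ~31 mA vs. 35 mA measured
```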

Best Answer

Measuring the cheap meter on my desk: the 10 A range uses a 0.1 ohm shunt resistor, the 400 mA range uses a 2.5 ohm shunt, and the 4 mA range uses a 100 ohm shunt.

100 mA through 0.1 ohms is a 10 mV drop, and 100 mA through 2.5 ohms is a 250 mV drop. So depending on the impedance of your circuit, the current can be lower on the lower range simply because of the increased series resistance.
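
The arithmetic, spelled out (the shunt values are the ones measured above):

```python
# Burden voltage V = I * R_shunt for 100 mA through each shunt
# measured on the meter above.
for name, r_shunt in [("10 A range", 0.1), ("400 mA range", 2.5)]:
    print(f"{name}: {0.100 * r_shunt * 1000:.0f} mV drop at 100 mA")
# 10 A range: 10 mV drop at 100 mA
# 400 mA range: 250 mV drop at 100 mA
```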

You can measure the voltage drop across the ammeter with another voltmeter.
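
From that measurement, Ohm's law gives the ammeter's effective series resistance; the numbers below are hypothetical example readings, not values from this post:

```python
# Effective series resistance of the ammeter, from a second meter's
# reading across it. Both numbers are hypothetical examples.
v_drop = 0.250  # volts, read across the ammeter by the second meter
i_read = 0.100  # amps, the ammeter's own reading
print(f"effective shunt resistance: {v_drop / i_read:.2f} ohms")  # 2.50 ohms
```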

When you can't tolerate voltage drop, try a feedback ammeter: http://www.keithley.com/data?asset=6169