LEDs in parallel, strange current consumption


I just wired a blue LED on a breadboard with the anode to an Arduino Uno's 3.3 V rail and the cathode through a 220 ohm resistor to ground. Then I measured the current consumption with my Fluke 87-V and got 1463 microamps.

Here's the strange part: I then added another blue LED of exactly the same type in parallel with the first, with its own 220 ohm resistor. I expected the current consumption to double to about 2926 microamps. Instead, the measured current was 2121 microamps. Can anyone explain this?

Update: Just added a third blue LED, same type, in parallel, 220 ohm resistor, and the current drawn is now 2681 microamps. Wtf?

Update 2: I just switched my Fluke to the mA range and now I get 5.41 mA drawn with 3 LEDs in parallel. With two in parallel on the mA range I get 3.61 mA, and with just one, 2.02 mA. All measurements in DC mode, of course. So is my Fluke broken, since it gives different readings on the uA and mA ranges?

Best Answer

There's nothing strange going on here.

With a 3.3 V supply and a blue LED, most of the supply voltage is dropped across the LED itself, so there is very little voltage left across the 220 ohm resistor(s).

(Schematic created using CircuitLab; not reproduced here.)
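
As a rough worked example (taking a blue LED forward voltage of about 2.9 V, which is an assumption for illustration, not a measured value):

$$V_R \approx 3.3\ \text{V} - 2.9\ \text{V} = 0.4\ \text{V}, \qquad I \approx \frac{0.4\ \text{V}}{220\ \Omega} \approx 1.8\ \text{mA}$$

With only a few hundred millivolts across the resistor, any extra series resistance of tens of ohms changes the current noticeably.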

Your Fluke meter (when set to measure current) has quite significant internal resistance, known as burden resistance, and it is higher on the more sensitive ranges. This resistance adds in series with the 220 ohms/n of the parallel branches (where n is the number of LEDs), which explains why the readings are lower on the uA range than on the mA range, and why the total doesn't simply scale with the number of LEDs. There are times when you can ignore the voltage drop of an ammeter (or the loading of a voltmeter), but this is not one of them.
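
Here's a minimal Python sketch of that effect. The forward voltage and the burden resistances below are assumed, illustrative figures (check the LED datasheet and the 87-V manual for real numbers); the point is only that a range-dependent series resistance stops the total current from scaling linearly with the number of LEDs.

```python
# Minimal sketch: why the ammeter's burden (series) resistance lowers the reading.
# All component values are illustrative assumptions, not measured facts:
#   V_SUPPLY - the Uno's 3.3 V rail
#   V_F      - blue LED forward voltage, treated as a constant ~2.85 V
#   R_LED    - the 220 ohm resistor in each LED branch
#   R_BURDEN - guessed meter burden resistance per range; check the 87-V manual

V_SUPPLY = 3.3      # volts
V_F = 2.85          # volts (assumed; really varies with current and between parts)
R_LED = 220.0       # ohms per branch

R_BURDEN = {        # assumed burden resistance for each range, in ohms
    "uA range": 90.0,
    "mA range": 2.0,
    "ideal meter": 0.0,
}

def total_current(n_leds, r_burden):
    """Total current for n identical LED+resistor branches in parallel,
    measured through a meter with series resistance r_burden.

    Each branch sees the supply minus the drop across the meter:
        V_SUPPLY - I * r_burden = V_F + (I / n) * R_LED
    which solves to:
        I = n * (V_SUPPLY - V_F) / (R_LED + n * r_burden)
    """
    return n_leds * (V_SUPPLY - V_F) / (R_LED + n_leds * r_burden)

for label, r_b in R_BURDEN.items():
    readings = ", ".join(f"{total_current(n, r_b) * 1e3:.2f} mA" for n in (1, 2, 3))
    print(f"{label:11s}: {readings}")
```

With the ideal-meter row the current simply triples for three LEDs; with tens of ohms of burden resistance it doesn't, which is the pattern in the question. The real numbers won't match exactly because the LED's forward voltage isn't actually constant.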

Another effect is that an output port pin also has some internal resistance (which again adds in series with the parallel resistors), so you would also expect the current to fall below the ideal calculation if you were driving the LEDs from a single port pin instead of the 3.3 V rail.
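
As a rough illustration of that second point, here is the same constant-forward-voltage model with an assumed source resistance of a few tens of ohms in place of the meter's burden resistance (the 40 ohm figure and the 3.3 V source are kept only for comparability; they are guesses, not datasheet values):

```python
# Hypothetical source-resistance case: same simple model as above, with an
# assumed ~40 ohm source resistance (e.g. a port pin's output driver) in
# series instead of the meter's burden resistance. 40 ohms is a guess for
# illustration, not a datasheet figure.
V_SUPPLY, V_F, R_LED, R_SOURCE = 3.3, 2.85, 220.0, 40.0  # volts / ohms, assumed

for n in (1, 2, 3):
    i_total = n * (V_SUPPLY - V_F) / (R_LED + n * R_SOURCE)
    print(f"{n} LED(s): {i_total * 1e3:.2f} mA")  # again, grows less than linearly
```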