Electrical – Calculating LED current using Ohm's law

current, current measurement, current-limiting, led

I've got a 4-digit green 7-segment display (technically 5 digits: it's from a microwave, and the "5th" digit is the function lights on the top and bottom). I don't have a datasheet for it. It's common cathode; the first two and last two cathodes control the digits, and the middle one controls the function lights. I'd like to figure out an ideal driving current so I can drive it from an Arduino safely. If I assume each segment takes, say, 10mA, then when that segment is lit on all "5" digits, the Arduino is sourcing 50mA through a single pin, more than the recommended maximum of 40mA. Rather than just settle for no more than 8mA per segment for a total of no more than 40mA, I'd like to actually do some math. I'm driving the display with a buck regulator set to 2.2V for testing.

Now, my multimeter sucks. It's one of those free pieces of crap from Harbor Freight. I like it because it's free and I can abuse (and break) it without really caring, since I can always get another, but I hate it because it's SO inconsistent with small measurements. It has 20mA, 200mA, and 10A current settings. On the 10A setting, the maximum resolution is tens of mA, and it reads between .01 and .02A. On the 200mA setting it says 5.6mA, and on the 20mA setting it says 4.8mA. This is frustrating, so I tried reading the current by converting it to a voltage across a resistor. I used a 100Ω 0.1% resistor in series, connected the black lead from the DMM to ground and the red lead between the LED and the resistor, and read the voltage in millivolts. I got 130mV. So, Ohm's law: $$\frac{0.130\ \text{V}}{100\ \Omega} = 0.0013\ \text{A} = 1.3\ \text{mA}$$ …1.3mA?! I can understand the crappy multimeter being bad at measuring current, but it's usually spot-on with voltages, which is why I did it this way. I can't understand getting that many different measurements, though.

So, my question is two-fold. First, did I do the Ohm's law measurement right, or did I miss something? Second, given that I don't have a constant-current LED driver, nor a way of making one right now, how would you recommend I accurately measure the current of these LEDs? I will be driving the display from the Arduino directly, so I'll be using resistors. And please, be nice. I don't have a lot of money for things, which is why I have a crappy free multimeter instead of a better, more expensive one.

Best Answer

I'm driving the display with a buck regulator set to 2.2V for testing

Don't do that. An LED is not like a resistor. Once past a certain threshold voltage the current increases rapidly, so small changes in supply voltage and/or circuit resistance cause a large change in LED current. The graph below shows some examples of LED current vs. voltage. Here you can see that increasing the red LED's voltage from 1.7V to 1.85V (a mere 9% increase) causes a ten-fold increase in current draw!

[Figure: LED current vs. voltage curves for several LED colors]

You should set the power supply to a higher voltage and limit the current with a resistor in series. The resistor drops the difference between the supply voltage and the LED voltage, setting the current according to Ohm's law: $$I = \frac{V_{supply} - V_{LED}}{R}$$

For this calculation you can assume that a red LED drops a constant ~1.9V, orange/yellow ~2V, and green ~2.1V. (This isn't quite true, since the voltage does increase somewhat at higher currents, but it's close enough for most purposes.) If you want greater accuracy, measure the voltage drop of your LEDs at different currents.
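As a worked example (using the ~2.1V green figure above, a 5V Arduino supply, and a 10mA target current, none of which come from your actual display, so treat the numbers as placeholders):

$$R = \frac{V_{supply} - V_{LED}}{I} = \frac{5\ \text{V} - 2.1\ \text{V}}{0.010\ \text{A}} = 290\ \Omega$$

Rounding up to the nearest standard value, 330Ω, gives roughly 8.8mA per segment. If you stick with your worst case of one segment lit on all five digits, going up to 390Ω gives about 7.4mA per segment and keeps that pin's total near 37mA, under the 40mA limit.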

Your multimeter may suck, but you shouldn't blame it for showing different current readings on different ranges. Most meters read current by measuring the voltage across a low-value shunt resistance. If the voltage required for a full-scale reading is the same on every range (e.g. 100mV), then the shunt resistance must be higher on the lower ranges. Since your circuit is very sensitive to series resistance, even the small resistance of the meter's shunt is enough to change the current.
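To put numbers on it (assuming a typical 100mV full-scale burden voltage; your meter's actual figure may differ):

$$R_{shunt} = \frac{100\ \text{mV}}{20\ \text{mA}} = 5\ \Omega \qquad R_{shunt} = \frac{100\ \text{mV}}{200\ \text{mA}} = 0.5\ \Omega$$

That 4.5Ω difference between ranges is negligible in most circuits, but with your LED sitting right at its threshold it's plausibly enough to move the reading from 5.6mA (200mA range, smaller shunt) down to 4.8mA (20mA range, larger shunt).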

When you inserted the 100Ω resistor and measured the voltage across it, you effectively added a large-value shunt resistance. The current then dropped very low because of the small difference between the power supply voltage and the LED's voltage drop. The answer to this problem is to keep the large-value resistor in the circuit and raise the supply voltage until you get a reasonable current draw.
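The numbers line up with this explanation. Taking the ~2.1V green LED figure from above with your 2.2V supply:

$$I = \frac{2.2\ \text{V} - 2.1\ \text{V}}{100\ \Omega} = 1\ \text{mA}$$

which is right around the 1.3mA you measured. Raise the supply to 5V and the same 100Ω resistor would pass about 29mA; swap in a larger resistor to dial the current down to your target, and the voltage reading across the resistor remains an accurate way to measure it.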