LED voltage drop resistor


So I was working on the switch backlighting for a switch plate on my boat. There is a single LED mounted in a light bar that sits behind the engraved legend plates.

I bought a replacement LED at RadioShack. It is a red 800 mcd LED. The package says 1.7 V – 20 mA.

I needed to calculate the voltage drop resistor. So I took the incoming voltage: 12 V – 1.7 V = 10.3 V.

10.3 V / 0.02 A = 515 ohm resistor. Right?
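
For reference, here is that same arithmetic as a minimal Python sketch, using only the package figures quoted above (nothing here is specific to this particular LED):

```python
# Series resistor sizing for an LED: the resistor has to drop whatever voltage
# the LED doesn't, at the LED's rated current (Ohm's law, R = V / I).
V_SUPPLY = 12.0    # supply voltage, volts
V_FORWARD = 1.7    # LED forward voltage from the package, volts
I_TARGET = 0.020   # LED current rating from the package, amps

r = (V_SUPPLY - V_FORWARD) / I_TARGET
print(f"series resistor: {r:.0f} ohms")   # -> 515 ohms; round up to a standard value
```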

So I buy a 560 ohm resistor and wire it in series with my LED. Luckily I am able to bench test with a variable power supply and a voltmeter. As I bring the supply voltage up to 12 V, the voltage across the LED rises above the rated 1.7 volts. Why?

What I ended up doing was wiring two 560 ohm resistors in series, and that brought the LED voltage to where it needed to be. Why didn't Ohm's law work? What did I do wrong or not account for? Could the LED current be labeled wrong on the package?

Best Answer

You've gotten fooled by the ratings. What matters in LEDs is not voltage, per se, but rather current. Specifically, the forward voltage is a typical number, not a critical limit in any way - it's just there for reference, and should be seen more as the minimum required for operation than as a maximum. In this case, 1.7 volts isn't terribly important (since it's a really, really bad idea to drive LEDs at a constant voltage - they tend to die). What is important is that the LED operate at less than 20 mA.

Assuming your 12 volts is in fact 12 volts exactly (and you did check this, I hope), a 1.9 volt reading on the LED means the voltage across the resistor was less than your expected 10.3 volts, and was in fact 10.1 volts. This in turn means that the current was less than 20 mA, nominally $$i = \frac{10.1\ \text{V}}{560\ \Omega} \approx 18\ \text{mA}$$ and you were in good shape. Especially since your resistor may well have had a 10% tolerance, so your current had roughly the same tolerance and could have been as high as about 19.8 mA.
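
As a quick sanity check of those numbers, here is the same calculation as a short Python sketch. The 1.9 V figure is the measured forward drop referenced above, and the worst-case line uses the same first-order 10% estimate as the text:

```python
# Re-checking the numbers: what the LED actually sees with one 560 ohm resistor,
# using the measured 1.9 V forward drop instead of the 1.7 V package rating.
V_SUPPLY = 12.0    # bench supply, volts
V_MEASURED = 1.9   # forward voltage actually measured across the LED, volts
R = 560.0          # series resistor, ohms

i_nominal = (V_SUPPLY - V_MEASURED) / R
print(f"nominal LED current: {i_nominal * 1000:.1f} mA")   # ~18 mA, under the 20 mA rating

# First-order worst case if the resistor sits at the low end of a 10% tolerance:
# the current rises by roughly the same 10%.
i_worst = i_nominal * 1.10
print(f"worst-case LED current: {i_worst * 1000:.1f} mA")  # ~19.8 mA, still under 20 mA
```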