Electronics – Is there a point in adding a 1 Ω resistor to this LED circuit?


I am trying to figure out an appropriate resistor value for the following circuit.


(Schematic: a 3.3 V supply driving a blue LED through a series resistor.)

Input voltage – 3.3 V
Blue LED, forward voltage – 3.3 V

My maths tells me that driving that LED at, say, 12 mA means the resistor should be:

R = (3.3 − 3.3) / 0.012 = 0 Ω
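The calculation above is the standard series-resistor formula; a minimal sketch showing why it collapses to zero here:

```python
# Standard LED series-resistor calculation: R = (V_supply - V_forward) / I_target
v_supply = 3.3    # supply voltage, volts
v_forward = 3.3   # LED forward voltage, volts
i_target = 0.012  # desired LED current, amps

r = (v_supply - v_forward) / i_target
print(f"R = {r:.1f} ohms")  # 0.0 ohms: no voltage headroom left for a resistor
```

With zero headroom the formula gives 0 Ω, which is why the calculators fall back to the smallest standard value.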

Checking with some LED resistor calculators, they suggest a 1 Ω resistor: http://led.linear1.org/1led.wiz

Given my input voltage and the LED's forward voltage drop, what difference does adding this 1 Ω resistor actually make to the circuit?


Best Answer

If you want to predict the current through an LED with that little headroom, you need a better model of the LED than "forward voltage is 3.3 V".

You can see that this device has a typical differential resistance (at the operating point) on the order of 30 Ω, so the 1 Ω series resistor is probably not doing much:

(Datasheet graph: LED forward current vs. forward voltage.)

On the other hand, what the graph doesn't show is that this curve likely moves around quite a bit if the device temperature changes.
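To see why 1 Ω is negligible next to ~30 Ω of differential resistance, here is a rough linearized-diode sketch (the operating point and Rd below are illustrative assumptions, not datasheet values):

```python
# Small-signal model around an assumed operating point: the LED behaves like
# a fixed drop v0 (at current i0) in series with a differential resistance rd.
# Solving v_supply = v0 + rd*(i - i0) + r_series*i for the LED current i:
def led_current(v_supply, r_series, v0=3.3, i0=0.012, rd=30.0):
    """Estimate LED current for a given supply and series resistor.

    v0, i0, rd are illustrative model parameters, not datasheet values.
    """
    return (v_supply - v0 + rd * i0) / (rd + r_series)

i_no_resistor = led_current(3.3, 0.0)  # ~12.0 mA
i_with_1ohm = led_current(3.3, 1.0)    # ~11.6 mA
print(f"without resistor: {i_no_resistor * 1000:.1f} mA")
print(f"with 1 ohm:       {i_with_1ohm * 1000:.1f} mA")
```

Under this model the 1 Ω resistor shifts the current by only a few percent; a temperature shift that moves the whole I–V curve would swamp that effect.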

I'd recommend either using a higher supply voltage, so that a reasonable series resistor can be chosen, or treating the 1 Ω resistor as a placeholder to be adjusted once you've built the circuit and measured what resistance is needed to avoid over-stressing the device at the worst-case temperature. Either way, be prepared for fairly large changes in brightness with temperature.

(Also note that the specified maximum forward voltage at 20 mA is 4 V, so you may also run into trouble with device-to-device variation.)