I'm having a bit of trouble understanding current-limiting resistors in simple LED circuits. I know that I can determine the optimal resistor like so:
\$\displaystyle R=\frac{V_{s}-V_{f}}{I_{f}}\$
But I'm having a hard time understanding how this one value modifies the voltage and the current to the correct values for the LED. For example, my calculations for a super bright blue LED (with \$V_{f}\$ being 3.0-3.4 V, \$I_{f}\$ being 80 mA, and a voltage source of 5 V) give me 25 ohms (using the lower bound of the forward voltage), which is fine. So the current throughout should be 80 mA, and the voltage drops across the resistor and LED should be 2 and 3 volts, respectively.
But what if I used a 100 ohm resistor instead? Or any other value—how would I calculate the voltage drops and current? Would I assume one of them stays the same?
Best Answer
The LED forward voltage drop will remain (roughly) the same, but the current can change, so the calculation becomes (the same equation, solved for I):
$$I_{LED} = {(V_s - V_f)\over{R}}$$
So for a 3V \${V_f}\$ and a 5V supply, the \$100\Omega\$ resistor would give \${(5V - 3V)\over{100 \Omega }} = 20 mA\$.
So if you know what current you want, just plug the values in, e.g. for 10mA:
$$R = {(5V - 3V)\over{0.01 A}} = {200 \Omega}$$
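As a quick sanity check, both directions of the equation above can be sketched in a few lines of Python (the function names are just for illustration; the numbers match the worked examples, and the usual caveat applies that \$V_f\$ is only roughly constant):

```python
# Ohm's law applied to the resistor in a simple series LED circuit.
# Assumes the LED forward voltage drop (v_forward) stays roughly
# constant once the LED is conducting.

def led_current(v_supply, v_forward, resistance):
    """LED current in amps for a given series resistor: I = (Vs - Vf) / R."""
    return (v_supply - v_forward) / resistance

def series_resistor(v_supply, v_forward, i_led):
    """Series resistance in ohms for a desired LED current: R = (Vs - Vf) / I."""
    return (v_supply - v_forward) / i_led

# 5 V supply, 3 V forward drop, 100 ohm resistor -> 0.02 A (20 mA)
print(led_current(5.0, 3.0, 100.0))

# 5 V supply, 3 V forward drop, 10 mA desired -> 200 ohms
print(series_resistor(5.0, 3.0, 0.01))
```

In practice you would then round up to the nearest standard resistor value, which errs on the side of slightly less current.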
Basically, because the supply voltage and the LED forward voltage can be relied upon to stay fairly constant, whatever resistor you put in will also have a fairly constant voltage across it (e.g. ~2 V in this case). So you just work out that voltage and select a resistance value according to the current you want.
Below is the V-I curve of a diode (from the Wikipedia LED page); notice that the current rises sharply (exponentially) while the voltage stays roughly the same once the "on" voltage is reached.
For more accurate control of the current you would use a constant current source, which is what most LED driver ICs provide.