LED resistor on a 3.3 V source – what happens to the LED current/voltage drop?

Tags: drive, led, resistors

This may be very basic, but for some reason I can't seem to wrap my head around it.

I've got a chip delivering 3.3 V to drive an LED. The LED has a typical forward voltage drop of 3.2 V and a maximum of 3.5 V.

First off, yes I can pick another LED but now I'm curious.

Assuming a 3.2 V drop at 20 mA (typical for this LED), I would need a 5 ohm resistor in series.
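A quick check of that arithmetic (values as above; this is just the usual series-resistor formula):

```python
# Series resistor for an LED: R = (V_supply - V_forward) / I_target.
# Numbers from the question; 3.2 V is the datasheet's typical forward drop.
v_supply = 3.3    # V, from the chip
v_forward = 3.2   # V, typical LED forward drop
i_target = 0.020  # A, desired LED current (20 mA)

r_series = (v_supply - v_forward) / i_target
print(f"Series resistor: {r_series:.1f} ohm")  # prints 5.0 ohm
```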

Now the thing that might be so basic but I'm not sure about:

When I increase the resistance, will the current decrease or will the voltage drop across the LED decrease, or both?

I know that my LED will most likely shine less brightly, but that is not the problem; it could be half as bright for all I care. What will happen if, for instance, I were to use a 400 ohm resistor?

Could someone elaborate on why one or the other happens?

EDIT:
Data sheet of the LED.

Best Answer

The tricky thing about LEDs is that they are current driven rather than voltage driven.

Your LED datasheet has this chart:

[Forward current vs. forward voltage chart from the LED datasheet]

You could read it as "for voltage X, current Y will flow through the LED."

It is, though, more useful to think of it as "for current Y, X volts will appear across the LED."
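To make that concrete, here is a rough sketch using a simple exponential (Shockley-style) diode model. The parameters are made up, chosen only so that the model gives about 3.2 V at 20 mA like your LED; the real datasheet curve will differ, but the shape is what matters: the current changes by a factor of ten while the voltage moves by only about 0.2 V.

```python
import math

# Illustrative exponential LED model. I_S and N are made-up values tuned to
# give roughly 3.2 V at 20 mA; they are NOT taken from the datasheet.
I_S = 2.4e-20   # saturation current (A), assumed
N   = 3.0       # ideality factor, assumed
V_T = 0.02585   # thermal voltage at ~300 K (V)

def led_current(v_f):
    """Current through the modelled LED at forward voltage v_f."""
    return I_S * (math.exp(v_f / (N * V_T)) - 1.0)

def led_voltage(i_f):
    """Forward voltage across the modelled LED at current i_f."""
    return N * V_T * math.log(i_f / I_S + 1.0)

for i in (0.002, 0.005, 0.020):
    print(f"{i * 1000:5.1f} mA -> {led_voltage(i):.2f} V")
# ->  2.0 mA -> 3.02 V
# ->  5.0 mA -> 3.09 V
# -> 20.0 mA -> 3.20 V
```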

If you try to control the LED current by setting the voltage directly, it won't work well: the curve is steep, so a small change in voltage (or in temperature) causes a large change in current, and you would be constantly adjusting the voltage to keep the current at the level you want.

It is far easier to use a constant current source and ignore the voltage.

An ideal current source provides the same current, no matter what voltage is required to make that current flow.

You can approximate a current source (which you wouldn't normally have at hand) by putting a resistor in series with a voltage source (which you almost always have available).

To get back to your question, your series resistor and voltage source make a current source that delivers a certain maximum current. That current applied to the LED will cause a certain voltage drop across the LED.
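As a sketch of what that means for your 5 ohm vs. 400 ohm question (reusing the same made-up exponential model as above, so the numbers are only indicative): the operating point is where the resistor's load line crosses the LED curve, which you can find numerically by bisection.

```python
import math

# Same made-up exponential LED model as before (about 3.2 V at 20 mA).
I_S, N, V_T = 2.4e-20, 3.0, 0.02585

def led_current(v_f):
    return I_S * (math.exp(v_f / (N * V_T)) - 1.0)

def operating_point(v_supply, r_series):
    """Bisect for the LED voltage where the resistor current equals the LED current."""
    lo, hi = 0.0, v_supply
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if (v_supply - mid) / r_series > led_current(mid):
            lo = mid   # resistor line is above the LED curve here: crossing is at a higher voltage
        else:
            hi = mid
    v_f = (lo + hi) / 2.0
    return v_f, (v_supply - v_f) / r_series

for r in (5, 50, 400):
    v_f, i = operating_point(3.3, r)
    print(f"R = {r:3d} ohm -> Vf = {v_f:.2f} V, I = {i * 1000:5.2f} mA")
```

With this model, going from 5 ohms to 400 ohms drops the forward voltage only from about 3.2 V to roughly 2.96 V, while the current collapses from about 20 mA to under 1 mA. So the answer to "current or voltage drop?" is: both decrease, but the current by far the most, because the LED curve is so steep.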

The approximation "series resistor and a voltage source" as a replacement for a current source works better the higher the source voltage is. The closer the source voltage is to the forward voltage of the LED, the less the combination behaves like an ideal current source.

If you use 100 VDC and a 20 kiloohm resistor as your current source, then the forward voltage of your LED hardly matters. Using the usual formula for the series resistor with 5 milliamperes, a 3 V LED, and a 100 V source, you come up with 19.4 kiloohms. Alternatively, 97 V divided by 20 kiloohms comes out as 4.85 milliamperes instead of 5 milliamperes. The differences are too small to care about.
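The same arithmetic, spelled out:

```python
# High-voltage case from above: 100 V source, ~3 V LED, 5 mA target.
v_supply, v_led, i_target = 100.0, 3.0, 0.005

r_exact = (v_supply - v_led) / i_target
print(f"Exact series resistor: {r_exact:.0f} ohm")           # 19400 ohm (19.4 k)

# Round up to a 20 k resistor and see how far the current moves:
i_with_20k = (v_supply - v_led) / 20_000
print(f"Current with 20 k:     {i_with_20k * 1000:.2f} mA")  # 4.85 mA instead of 5 mA
```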

Now if you try the same thing at 3.3 V, you will find that a small change in the resistor, the source voltage, or the forward voltage makes a large difference in the current.
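For example, here is a sketch of how a 0.1 V spread in forward voltage moves the current in both cases (5 ohms at 3.3 V, as in the question, versus 19.4 kiloohms at 100 V):

```python
# Compare current sensitivity to forward voltage at low and high supply voltage.
def current(v_supply, r_series, v_forward):
    """Current set by a series resistor, treating the LED drop as fixed."""
    return (v_supply - v_forward) / r_series

for v_f in (3.1, 3.2):
    i_low  = current(3.3, 5, v_f)        # 3.3 V supply, 5 ohm resistor
    i_high = current(100.0, 19400, v_f)  # 100 V supply, 19.4 k resistor
    print(f"Vf = {v_f} V: 3.3 V source -> {i_low * 1000:.0f} mA, "
          f"100 V source -> {i_high * 1000:.3f} mA")
# Vf = 3.1 V: 3.3 V source -> 40 mA, 100 V source -> 4.995 mA
# Vf = 3.2 V: 3.3 V source -> 20 mA, 100 V source -> 4.990 mA
```

A 0.1 V shift in forward voltage doubles the current at 3.3 V but changes it by about 0.1 % at 100 V.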

This is why you normally try to keep the source voltage well above the LED forward voltage.