Does an LED always need a resistor?


At school I was told "Never use an LED without a resistor (before or after the LED)!".

Why is that?

If I have a 2 V LED, why can't I simply use a 2 V power supply?

Why should I instead choose, say, a 3 V power supply and put a resistor in front of the LED?

And how do I calculate the resistor value?

Best Answer

You don't have to have a resistor to go with your LED.

You have to have a way to limit the current to the LED.

The simplest way to limit the current is to put a resistor in series with the LED. At low currents and small voltage differences, this works well enough.

If you have a high current LED or a very large voltage difference, then the resistor has to waste a lot of power. That means you have to use a physically large resistor to handle the waste heat, and that your lighting circuit is very inefficient.
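As a rough illustration with made-up numbers: driving a 1 A LED with \$V_f = 3\text{ V}\$ from a 12 V supply means the series resistor drops 9 V and dissipates \$9\text{ V} \times 1\text{ A} = 9\text{ W}\$, while the LED itself only uses 3 W - most of the power ends up as heat in the resistor.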

For low current LEDs that need to be precisely controlled for brightness, you would use a constant current source rather than a simple resistor.

Many constant current sources use a series transistor to control the current, which wastes power just as a resistor does.

Alternatively, you can use a switching power supply that regulates the current rather than the voltage. This kind of circuit is normally used for high current LEDs where you don't want to waste a lot of power. LED drivers for household lights are often constant current switching power supplies.


For a typical simple LED circuit, you can easily calculate the value of the needed series resistor.

You need the following things:

  1. \$V_f\$ - this is the forward voltage of the LED. You get it from the datasheet of the LED.
  2. \$V_{supply}\$ - this is the voltage of the power supply you are using.
  3. \$I_{LED}\$ - this is the current that you want to let flow through the LED. More current = brighter light - until you exceed the current rating for the LED, at which point it burns out. The LED datasheet will usually have a rated maximum current that you must stay below as well as a typical current rating which will deliver a usable brightness.

Once you have all the numbers together, you can calculate a value for a series resistor like this:

\$R_{series} = (V_{supply}-V_{f})/I_{LED}\$
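As a quick example with assumed values: for a 5 V supply, an LED with \$V_f = 2\text{ V}\$, and a target current of 20 mA, this gives \$R_{series} = (5 - 2)/0.02 = 150\ \Omega\$. You would then pick the nearest standard resistor value at or above that result.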

Since you often don't know how bright the LED will be for a given current, you could calculate the resistor for the rated typical current. Try that out, and use a larger resistor if it is too bright. If it is not bright enough, then you need a different LED - don't use a lower resistor value than you calculated, as the higher current will cause the LED to burn out. Maybe not immediately, but certainly sooner than if you followed the manufacturer's guidelines.


You cannot simply use a 2 V power supply for a hypothetical 2 V LED because the forward voltage is not exactly 2 V on every LED, and it also changes with temperature.

Also, the resistance of an LED changes drastically with the applied voltage. Below \$V_f\$, nearly no current flows. At \$V_f\$, a bit of current will flow. At a couple of tenths of a volt above \$V_f\$, the LED becomes the next best thing to a short circuit.
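To get a feel for how steep that relationship is, here is a minimal sketch using the idealized Shockley diode equation. The parameter values below are made up purely for illustration (real LEDs also have series resistance and other non-ideal behaviour), so treat the numbers as qualitative only:

```python
import math

# Idealized Shockley diode model: I = I_S * (exp(V / (n * V_T)) - 1)
# The parameters below are illustrative assumptions, not datasheet values.
I_S = 8.5e-20   # saturation current in amps (assumed)
N = 2.0         # ideality factor (assumed)
V_T = 0.025     # thermal voltage at room temperature, about 25 mV

def led_current(v_forward):
    """Current through the idealized LED at a given forward voltage (volts)."""
    return I_S * (math.exp(v_forward / (N * V_T)) - 1)

for v in (1.8, 1.9, 2.0, 2.1, 2.2):
    print(f"{v:.1f} V -> {led_current(v) * 1000:.3g} mA")
```

With these assumed parameters, moving from 1.8 V to 2.2 V takes the current from a fraction of a milliamp to over an amp, which is why trying to set the brightness with a fixed supply voltage is so unreliable.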

There is a very narrow voltage range in which an LED would work properly. The best thing to do is to limit the current, and the voltage will work itself out.
