Electronic – Rule of thumb for LED Ballast Resistor

Tags: led, ohms-law, resistors

As we all know, driving an LED without some form of current control is a bad idea. One of the most basic methods is a simple ballast resistor sized with Ohm's law: given Vin, Vf, and the desired current, we can determine the resistor needed.
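The calculation the question refers to can be sketched as follows (a minimal Python sketch; the function name and the 5 V supply are illustrative, not from the question):

```python
# Ballast resistor from Ohm's law: R = (Vin - Vf) / I.
# The 5 V supply here is an assumed example value.
def ballast_resistor(v_in, v_f, i_led):
    """Return (resistance in ohms, resistor power dissipation in watts)."""
    v_r = v_in - v_f        # voltage dropped across the resistor (the "headroom")
    r = v_r / i_led         # Ohm's law
    p = v_r * i_led         # power the resistor must dissipate
    return r, p

r, p = ballast_resistor(v_in=5.0, v_f=3.3, i_led=0.020)
print(f"R = {r:.0f} ohm, P = {p * 1000:.0f} mW")  # R = 85 ohm, P = 34 mW
```

In practice you would round R up to the nearest standard E12/E24 value, trading a slightly lower current for a stock part.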

But in broad strokes, what is the best way to choose that resistor? Given an LED that draws 20mA at a 3.3V forward voltage, fed from a source voltage only slightly above that (and possibly variable), how much headroom does the resistor need for good current control?

If we provide too much headroom (Vin of 12V) for a single 3.3V LED, we are just wasting energy and need a higher-wattage resistor. If we provide very little headroom (Vin of 3.4V), the resistor only gets 0.1V to work with; is that enough? Changes in Vf and If due to temperature look like they would create a feedback loop between the two.
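To put numbers on the waste, a quick comparison of resistor dissipation and overall efficiency for the two supply voltages mentioned above (an illustrative sketch, not from the question):

```python
# Resistor dissipation and LED-to-total efficiency for a 3.3 V / 20 mA LED
# at the two supply voltages from the question (12 V vs. a 5 V example).
V_F, I_LED = 3.3, 0.020

def resistor_power(v_in, v_f=V_F, i_led=I_LED):
    return (v_in - v_f) * i_led  # watts burned in the ballast resistor

for v_in in (12.0, 5.0):
    p_r = resistor_power(v_in)
    p_led = V_F * I_LED          # 66 mW delivered to the LED either way
    eff = p_led / (p_led + p_r)
    print(f"Vin = {v_in} V: resistor burns {p_r * 1000:.0f} mW, "
          f"efficiency {eff:.0%}")
```

At 12 V the resistor burns 174 mW, more than twice what the LED itself uses, which is why a quarter-watt part is needed and most of the input power is wasted as heat.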

So the question is: for good control, how much voltage headroom does the resistor need?

Best Answer

This is something that is dependent on the LED, the power source, and the operating conditions. If you want a cover-all rule of thumb, allow at least 0.5V of headroom, or, if you need absolute reliability, go for 1V.
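To see why 0.5V is a reasonable floor, consider how a 0.1V shift in Vf (e.g. from temperature) moves the LED current for different headroom choices. This sketch is illustrative and not from the answer; the 0.1V shift is an assumed example value:

```python
# Relative current change when Vf rises by dvf, for a resistor sized
# to give i_nom at the nominal headroom. Illustrative values only.
def current_shift(headroom, dvf=0.1, i_nom=0.020):
    r = headroom / i_nom             # resistor sized for nominal current
    i_new = (headroom - dvf) / r     # current after Vf eats dvf of the headroom
    return (i_new - i_nom) / i_nom   # relative change

for headroom in (0.1, 0.5, 1.0):
    print(f"{headroom} V headroom: {current_shift(headroom):+.0%} current change")
```

With only 0.1V of headroom the same Vf shift extinguishes the LED entirely, while at 0.5V the current drops about 20% and at 1V only about 10%: more headroom makes the current less sensitive to Vf variation, at the cost of resistor dissipation.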

Of course, if you want absolute assurance you are not wasting too much energy and still not damaging the LED, you can:

  1. Get all the datasheets together, inspect the LED's current/voltage curves across the possible temperature range, compare that to whatever voltage regulation you intend to apply, choose the worst-case combination, and size your resistor for that, accepting that in the best case the current may drop noticeably.

  2. Design a switching current-mode driver. There are a million and one LED driver chips in the catalogs of Linear Technology and its competitors (TI and Analog Devices probably have some as well, for example) that drive an LED by switching an inductor while monitoring the LED current with a small sense resistor, often "wasting" only 100mV or even less.