Electronic – Powering high power LEDs without resistors

Tags: high-power, led, power-supply, resistors

I have a bunch of 3W UV LEDs, each of which takes approximately 3.6V at 0.7A.

I need to power 50 of them since I need about 150W total power.

My power supply is 12V, so I need to lower that down to 3.3-3.6V to power the LEDs without breaking them.

If I use the standard resistor-diode circuit, it would be something like this:

(Schematic: a single LED in series with a current-limiting resistor across the 12V supply. Created using CircuitLab.)

Calculations:

Vp = 12V  - PSU voltage
Vd = 3.3V - diode voltage
Id = 0.7A - diode current

Vr = Vp - Vd = 12V - 3.3V = 8.7V
R = Vr / Id = 8.7V / 0.7A ~= 12.4Ω (round up to 13Ω)

Pr = R * Id^2 = 13Ω * (0.7A)^2 ~= 6.4W - resistor power (so roughly a 7W part)

If my calculations are correct, I would need a 13Ω / 7W resistor for each LED, and the circuit would draw approximately (7W + 3W) * 50 LEDs = 500W.
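
As a sanity check on those numbers, here is a minimal Python sketch using only the values quoted above (note the exact supply draw comes out a bit below the rounded 500W figure):

```python
# Sanity check for the one-resistor-per-LED approach (values from the question).
V_SUPPLY = 12.0   # V, PSU voltage
V_LED    = 3.3    # V, LED forward voltage
I_LED    = 0.7    # A, LED current
N_LEDS   = 50

v_resistor = V_SUPPLY - V_LED            # 8.7 V dropped across the resistor
r_value    = v_resistor / I_LED          # ~12.4 ohm; rounded up to 13 ohm above
p_resistor = v_resistor * I_LED          # ~6.1 W turned into heat in each resistor
p_led      = V_LED * I_LED               # ~2.3 W in each LED
p_total    = V_SUPPLY * I_LED * N_LEDS   # 420 W drawn from the 12 V supply

print(f"R = {r_value:.1f} ohm, dissipating {p_resistor:.1f} W per LED")
print(f"Total draw: {p_total:.0f} W, of which only {p_led * N_LEDS:.0f} W reaches the LEDs")
```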

Now, I don't have a PSU that can provide that much power, high-power resistors are quite big, and I suspect there is a more modern approach to this problem that I don't know about.

Is there a way to avoid using resistors to regulate voltage on high power LEDs?

Best Answer

The key to using a current-limiting resistor is that it must be large relative to the load. Resistors have a relatively predictable, roughly linear change in resistance with rising temperature. LEDs, on the other hand, change non-linearly with temperature: they get hot, draw more current, and get hotter still. Given enough power, they simply burn out. Your equations are basically correct, but the method takes a bit of trial and error. It will give you a good place to start, but you have to tune the resistor at steady state (after everything has had a chance to warm up). Your resistor is large compared to the LED, and that's good: it makes controlling the LED this way more stable. But it's inefficient; you waste just over two thirds of your power.

A better approach is to put several LEDs in series so that their combined forward voltage gets closer to the 12V and the resistor doesn't have to drop so much. With a little trial and error, this approach will work, but it's hard to get the most out of the LEDs; to be safe, you would tend to under-power them a little with this method.
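
For example, with three of these LEDs per string (a rough sketch using the same 3.3V / 0.7A figures as above; the component values are illustrative, not measured):

```python
# Three LEDs in series per string, one small resistor per string (illustrative values).
V_SUPPLY        = 12.0
V_LED           = 3.3    # V per LED, from the question
I_STRING        = 0.7    # A, target current per string
LEDS_PER_STRING = 3
N_LEDS          = 50

v_string   = V_LED * LEDS_PER_STRING        # 9.9 V across the three LEDs
v_resistor = V_SUPPLY - v_string            # only 2.1 V left for the resistor
r_value    = v_resistor / I_STRING          # 3.0 ohm
p_resistor = v_resistor * I_STRING          # ~1.5 W wasted per string (vs ~6 W per LED before)
n_strings  = -(-N_LEDS // LEDS_PER_STRING)  # 17 strings; the last holds only 2 LEDs and
                                            # needs its own, larger resistor value

print(f"R = {r_value:.1f} ohm per string, {p_resistor:.1f} W per resistor, {n_strings} strings")
print(f"Total draw roughly {V_SUPPLY * I_STRING * n_strings:.0f} W")
```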

Better still is to add a constant-current drive. You can still use your power supply, but add an LED driver circuit in series between the power supply and the LEDs. I don't know what access you have to making PCBs, or if that's even an option for you. If not, you can search for "constant current DC-DC converter" and find some pretty cheap ones for what you are doing. Here is one for about $5 that would do: 250W DCDC CC Converter. I suspect that is the easiest option. Another option would be to add a constant-current driver per set of 3 LEDs, something like a CAT4101 by ON Semi.
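
If you go the constant-current converter route, a quick power budget for the 12V supply looks something like this (a rough sketch; the 90% converter efficiency is an assumption, not a figure from the linked module):

```python
# Rough input-power budget when constant-current converters drive the LEDs.
V_LED, I_LED, N_LEDS = 3.3, 0.7, 50
V_SUPPLY   = 12.0
EFFICIENCY = 0.90   # assumed converter efficiency -- not taken from any datasheet

p_leds   = V_LED * I_LED * N_LEDS   # ~115 W actually delivered to the LEDs
p_input  = p_leds / EFFICIENCY      # ~128 W drawn from the 12 V supply
i_supply = p_input / V_SUPPLY       # ~10.7 A the PSU must be able to source

print(f"LED power ~{p_leds:.0f} W, input ~{p_input:.0f} W, supply current ~{i_supply:.1f} A")
```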

All of these ways will work; you just need to pick the one that best meets your requirements. I would buy the CC driver. If you go the resistor route, remember to start high, test with 3 in series, let them get up to temperature, and check how much current they are drawing. To do that, just measure the voltage drop across the resistor and use V = IR to find your current. That should be pretty close (the resistor's resistance will have changed a little with temperature). Your resistors will be dissipating a lot of heat, so you may want a fan or other heat sinking.
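
As a small illustration of that measurement step (the voltmeter reading here is a made-up example, not a real measurement):

```python
# Estimate the LED current from the voltage measured across the series resistor.
R_FITTED   = 13.0   # ohm, whatever resistor you actually fitted (13 ohm from the question)
V_MEASURED = 8.5    # V, hypothetical voltmeter reading across that resistor
I_TARGET   = 0.7    # A, the current you are aiming for

i_estimate = V_MEASURED / R_FITTED   # Ohm's law; ignores the resistor's drift with temperature
print(f"Estimated current: {i_estimate:.2f} A (target {I_TARGET} A)")
if i_estimate > I_TARGET:
    print("Too much current: step the resistor value up a little.")
```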

Good luck.