Electronic – Powering LED Strips with a Constant Voltage Supply and Constant Current Driver

constant-current · led-strip · power-supply

I have an LED strip (16 ft HitLights Luma5 warm-white high-density) powered by a constant voltage supply. I'm considering adding PWM dimming with a Maxim MAX16820 constant current driver. The PWM signal is generated by a microcontroller that monitors a pot for brightness control. One difference from the reference circuit is that the LED strip consists of parallel segments of three series LEDs, each segment with its own current-limiting resistor.

Some sources suggest that a hybrid architecture combining a constant current driver with a constant voltage supply is beneficial (e.g. this reference).

In my case, is there anything that makes this a bad topology, or traps to consider? And what are the benefits of this setup?

Best Answer

The point of using a constant current driver is to avoid less efficient current-limiting elements such as resistors. Since your strip already has resistors acting as the current limiter, adding a constant current IC and its supporting components would be more detrimental than beneficial: you would take on the complexity and the inefficiency of both methods at once.

A constant current driver also makes the setup less flexible. If you change the length of the strip, you have to readjust the regulated current to match. With a constant voltage supply you can simply add or remove strip length on the same supply, provided it can handle the load, because the current limiting is embedded in the strip itself.

If your concern is how to implement the PWM dimming, driving the strip through a FET (switching the strip's supply with the microcontroller's PWM output) is a common and perfectly adequate method.
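To see why the strip's own resistors already do the current limiting, here is a rough back-of-envelope sketch. All numbers are illustrative assumptions, not HitLights specifications: a 12 V strip, roughly 3.0 V forward drop per warm-white LED, and a 150 Ω resistor per three-LED segment. It also shows the kind of pot-to-duty-cycle mapping the microcontroller would do for the FET dimming approach.

```python
# Illustrative values only -- check your strip's datasheet.
V_SUPPLY = 12.0   # V, typical for 5050-style strips
V_F = 3.0         # V, assumed forward drop per warm-white LED
R_LIMIT = 150.0   # ohms, assumed per-segment limiting resistor

def segment_current(v_supply=V_SUPPLY, v_f=V_F, r=R_LIMIT, leds=3):
    """Current through one series segment of `leds` LEDs, set by the
    segment's resistor -- this is the 'constant current' already built
    into the strip."""
    return (v_supply - leds * v_f) / r

def pwm_duty(pot_reading, adc_max=1023):
    """Map a raw 10-bit ADC pot reading to a 0.0-1.0 PWM duty cycle
    for the FET gate drive (hypothetical brightness control)."""
    return max(0.0, min(1.0, pot_reading / adc_max))

if __name__ == "__main__":
    i = segment_current()
    print(f"Per-segment current: {i * 1000:.0f} mA")
    print(f"Duty at mid-pot: {pwm_duty(512):.2f}")
```

With these assumed values each segment draws about 20 mA regardless of how many segments you parallel, which is exactly why strip length can change freely on a constant voltage supply but would require readjustment with an external constant current driver.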