RGB LED Strip – Variable Voltage vs. PWM

Tags: atmega, avr, led, led-strip, microcontroller

I am going to install an analog RGB (non-addressable) LED strip in my room and need to build a driver for it.

The LED strip specs are:

  • 10 cm segments
  • 12 V @ 60 mA max per segment

I would be using 330 cm of the strip (33 segments × 60 mA ≈ 2 A max total, so about 0.7 A max per color channel).

My initial thought was to use a microcontroller with three PWM channels for red, green and blue. But then I realized I could probably get away with using three variable resistors to provide a variable voltage to the three channels, and the color could then be changed simply by adjusting these resistors.

Would this approach be okay? After all, PWM does the same thing … generate analog voltage levels from digital.

The only downside I can think of is that the variable resistors would need to handle that much current (vs. the PWM solution, where a MOSFET/BJT would take care of it).

Any thoughts?

Best Answer

Using variable resistors would work, but could be tricky to implement. For example, say each segment has a 130 ohm current-limiting resistor per channel. With 33 segments in parallel, that is effectively a resistance of 130 / 33 ≈ 4 ohms. So to halve the current for that channel you would need to add a single ~4 ohm series resistor, which at the halved current of 0.35 A dissipates 0.35² × 4 ≈ 0.5 W — and that setting is also roughly the worst case as you sweep a rheostat. In a quick search on element14 I couldn't find a cheap low-resistance potentiometer with that kind of power rating. You could use a pot to control a power transistor instead, but for that effort why not just go to PWM? :)
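
To sanity-check those numbers, here is a small desktop C++ snippet (not Arduino code) that sweeps the rheostat setting and prints the channel current and the heat in the pot. It assumes the strip's own limiting resistors drop about 2.8 V at the full 0.7 A (the 4 ohm effective figure above) and that the LEDs' forward voltage stays constant as the current falls — both simplifications.

```
// Heat in a series rheostat used to dim one 0.7 A channel.
// Model: constant 2.8 V across the total series resistance,
// i.e. LED forward voltage assumed fixed.
#include <cstdio>

int main() {
    const double V_R   = 2.8; // volts across the total series resistance
    const double R_eff = 4.0; // strip's effective limiting resistance, ohms

    for (double R_pot = 0.0; R_pot <= 12.0; R_pot += 2.0) {
        double I = V_R / (R_eff + R_pot); // channel current, amps
        double P = I * I * R_pot;         // power burned in the pot, watts
        printf("R_pot = %4.1f ohm   I = %.2f A   P = %.2f W\n", R_pot, I, P);
    }
    return 0;
}
```

The dissipation peaks at roughly 0.5 W with the pot at 4 ohms, i.e. right at half brightness.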

PWM is also more power efficient. Here is a tutorial on getting it running using an Arduino and PWM.
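
For reference, a minimal sketch of what that looks like (my assumptions, not taken from the tutorial): pins 9, 10 and 11 for the three channels, each gate driving a logic-level N-channel MOSFET that low-side switches its 12 V channel.

```
// Minimal Arduino sketch: drive an analog RGB strip with hardware PWM.
// Assumes each 12 V channel is switched low-side by a logic-level
// N-channel MOSFET whose gate sits on one of these PWM pins.
const int RED_PIN   = 9;
const int GREEN_PIN = 10;
const int BLUE_PIN  = 11;

void setColor(int r, int g, int b) {
    analogWrite(RED_PIN,   r); // 0 = off, 255 = full brightness
    analogWrite(GREEN_PIN, g);
    analogWrite(BLUE_PIN,  b);
}

void setup() {
    pinMode(RED_PIN,   OUTPUT);
    pinMode(GREEN_PIN, OUTPUT);
    pinMode(BLUE_PIN,  OUTPUT);
}

void loop() {
    setColor(255, 0, 0);   // red
    delay(1000);
    setColor(0, 0, 255);   // blue
    delay(1000);
    setColor(128, 0, 128); // purple
    delay(1000);
}
```

Since each MOSFET is either fully on or fully off, it dissipates very little — unlike a series resistor, which burns the removed power as heat.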