How to choose a power supply to drive 2 x 10 W RGB LEDs

arduino led power-supply pwm

I'm trying to figure out what power supply I should use for different types of LEDs, both for this project and for future ones.

I have 2 x RGB LEDs:

Forward Current: 350 mA

Forward Voltage:

  • RED: 6V – 6.6V
  • BLUE: 9.6V – 10.2V
  • GREEN: 9.6V – 10.2V

I want to run these in parallel from a single power supply, and was thinking of using a 12 V / 2 A / 25 W LED strip power supply as my main power source.

I'm going to use an Arduino and attempt to dim them using PWM through a transistor.
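Something like the sketch below is what I have in mind for the dimming side (pin numbers and the fade pattern are just placeholders; each PWM pin would drive a transistor switching one colour channel):

    // Placeholder dimming sketch: each PWM pin drives the base/gate of a
    // transistor that switches one LED colour channel.
    const int RED_PIN   = 9;   // PWM-capable pins on an Uno: 3, 5, 6, 9, 10, 11
    const int GREEN_PIN = 10;
    const int BLUE_PIN  = 11;

    void setup() {
      pinMode(RED_PIN, OUTPUT);
      pinMode(GREEN_PIN, OUTPUT);
      pinMode(BLUE_PIN, OUTPUT);
    }

    void loop() {
      // Fade all three channels up, then back down.
      for (int duty = 0; duty <= 255; duty++) {
        analogWrite(RED_PIN, duty);
        analogWrite(GREEN_PIN, duty);
        analogWrite(BLUE_PIN, duty);
        delay(10);
      }
      for (int duty = 255; duty >= 0; duty--) {
        analogWrite(RED_PIN, duty);
        analogWrite(GREEN_PIN, duty);
        analogWrite(BLUE_PIN, duty);
        delay(10);
      }
    }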

What's the best way to size a power supply for high-power LEDs, including when they're wired in parallel?

My understanding is that if I'm drawing 20 watts, the power supply should ideally be able to output slightly more than I'm drawing. Does the same go for the amperage?
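To put numbers on that for my case (a rough tally, assuming all six colour channels, three per LED, run at the full 350 mA):

    6 strings × 0.35 A = 2.1 A
    2.1 A × 12 V = 25.2 W

which is already above the supply's 2 A / 25 W rating.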

Is using this LED Strip power supply a bad idea?

Best Answer

I think you should go up to the next largest power supply size or run the LEDs at less than full current. For example, if you ran them at 275 mA, the total draw across the six colour strings would be 6 × 0.275 A = 1.65 A.

The current is the important factor, since you're going to be wasting significant power in the series resistors. You should use six series resistors, one for each LED string. If the forward voltage of the red LEDs is 6.3 V, then (12 − 6.3) V / 0.275 A ≈ 21 Ω, so use the next standard value, 22 Ω, rated for 2 W (it dissipates about 0.275 A × 5.7 V ≈ 1.6 W). You need two of those, and the other values are calculated the same way.
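If you want to redo that arithmetic for other currents or supply voltages, a short throwaway program like the one below does the same calculation for all three colours (a sketch only; the forward voltages are the midpoints of the ranges in your question):

    #include <cstdio>

    // Series resistor for one LED string: R = (Vsupply - Vf) / I,
    // dissipating roughly P = I * (Vsupply - Vf).
    int main() {
        const double vSupply = 12.0;   // supply voltage, V
        const double current = 0.275;  // derated string current, A

        struct Channel { const char* name; double vf; };
        const Channel channels[] = {
            {"red",   6.3},   // midpoint of 6.0-6.6 V
            {"green", 9.9},   // midpoint of 9.6-10.2 V
            {"blue",  9.9},
        };

        for (const Channel& ch : channels) {
            double drop  = vSupply - ch.vf;   // voltage across the resistor
            double ohms  = drop / current;    // exact resistance
            double watts = drop * current;    // power burned in the resistor
            printf("%-5s: R = %5.1f ohm, P = %4.2f W (round up to a standard value)\n",
                   ch.name, ohms, watts);
        }
        return 0;
    }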

As Funkyguy says, it's not a good idea to run the power supply or the LEDs at full rated power. Often there is an explicit derating chart for temperature or similar, but you may not get that with no-name Chinese supplies, and you may not have a full datasheet for the LEDs.