Electronic – How to limit current to LEDs

Tags: constant-current, current-limiting, led

I have understood that this is a bad way to do it:

Since some LEDs can draw more current than others.

So let's say I want to power some high-power LEDs, say 10 × 3 W LEDs with a forward current of 600 mA at 3.2 V. This is just an example, so I'm not going to look up a specific LED; it's only for learning.

What is the best way to ensure that all the LEDs get the same voltage and current?

Additional questions

Let's say I have a power supply that delivers 2 A at 3.6 VDC, and I have an LED with a forward current of 350 mA at 3.6 VDC. Can I connect the LED directly to the power supply without a resistor, since the supply already delivers 3.6 VDC?

Best Answer

No, that's the right way to do it: each LED gets its own ballast (series current-limiting) resistor, sized in ohms and wattage to suit the LED's desired forward current and the voltage source.
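The sizing described above is just Ohm's law applied to the voltage the resistor must drop. A small sketch, using assumed example values (5 V supply, LED with Vf = 3.2 V at 350 mA, not taken from a specific datasheet):

```python
def ballast_resistor(v_supply, v_forward, i_forward):
    """Return (resistance in ohms, dissipated power in watts) for a
    series ballast resistor feeding one LED."""
    v_drop = v_supply - v_forward   # voltage the resistor must drop
    r = v_drop / i_forward          # Ohm's law: R = V / I
    p = v_drop * i_forward          # power burned in the resistor: P = V * I
    return r, p

# Assumed example: 5 V supply, Vf = 3.2 V, If = 350 mA
r, p = ballast_resistor(5.0, 3.2, 0.350)
print(round(r, 1), round(p, 2))  # 5.1 0.63
```

In practice you'd round up to the next standard resistor value and pick a power rating with headroom (here, at least double the 0.63 W, so a 1 W or 2 W part).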

The "wrong" way to do it would be a single resistor shared by multiple LEDs in parallel.

But if you are going with 3 W LEDs (that's about 940 mA at 3.2 V, not 600 mA), then a resistor is not a good choice; you should look into constant-current sources. Using a resistor would require power resistors that can handle high wattage: from a 5 V source, the resistor drops 1.8 V at 0.94 A, dissipating 1.8 V × 0.94 A ≈ 1.69 W, so a 2 W resistor is needed for each LED. That's a lot of wasted power, roughly 64% efficiency.
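The arithmetic in that last paragraph can be checked with a short sketch (3 W LED at an assumed Vf of 3.2 V, run from a 5 V supply through a series resistor):

```python
led_power = 3.0    # W, the LED's rated power
v_forward = 3.2    # V, assumed forward voltage
v_supply = 5.0     # V, assumed supply voltage

i_forward = led_power / v_forward                 # ~0.94 A forward current
v_drop = v_supply - v_forward                     # 1.8 V across the resistor
p_resistor = v_drop * i_forward                   # ~1.69 W lost as heat
efficiency = led_power / (v_supply * i_forward)   # ~64 % reaches the LED

print(i_forward, p_resistor, efficiency)
```

Nearly two watts of heat per LED is why a switching constant-current driver, rather than a resistor, is the usual choice at these power levels.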