Electrical – 10W LED, no need for constant current

Tags: led, led-driver, power-supply

I've got a few of these LEDs, which are in a 3S3P configuration and rated 9-12 V, 900-1050 mA.
I tried to power one of them with this module, which I had earlier set to 12 V and a 1 A limit using a multimeter across the shorted output.

The problem is that when I connect the LED in series with the multimeter, the latter shows the LED drawing less than 400 mA.

So, out of curiosity and prepared to fry an LED, I connected one directly to a power supply (a laptop brick, 12 V 3 A), and even then it doesn't draw more than 600 mA.
More LEDs in parallel give me these values (per-LED current is worked out just below the list):

  • 2 LEDs: 1200 mA
  • 3 LEDs: 1600 mA
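
A quick per-LED breakdown of those totals (plain Python, just arithmetic on the measurements above):

```python
# Per-LED current from the measured totals (numbers from the list above).
totals_mA = {1: 600, 2: 1200, 3: 1600}
for n, total in totals_mA.items():
    print(f"{n} LED(s): {total / n:.0f} mA each")
# 1 LED(s): 600 mA each
# 2 LED(s): 600 mA each
# 3 LED(s): 533 mA each
```

So each LED settles near 600 mA on its own, and the per-LED share even drops slightly once three of them load the brick.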

I'd like to push the rated current through the LEDs (so as to take them to full brightness). It also seems weird that they appear to be current-limiting themselves without the need for a constant-current circuit.

Does anyone have an explanation for, or experience with, this behaviour?

Thanks

Best Answer

You have not mentioned the color of the LEDs. The typical white LEDs that make up modules like yours take more than 3 V to reach their maximum rated power; I would say around 3.5 V is a common number.

For a 3.5 V white LED, probably around 20% of the voltage drop is due to ESR (effective series resistance). An LED (of the same base color) that requires a lower voltage, and therefore has a correspondingly lower ESR, would be more efficient.

So there is effectively an unavoidable resistor in series with every LED already. If you use the numbers 12 V, 0.6 A and 20% ESR, then ESR = 12 V × 20% / 0.6 A = 4 Ω, which puts the diode-like part of the drop at 12 V − 0.6 A × 4 Ω = 9.6 V. These are guesses and may not reflect what you have. But if these guesses were exactly right, you would get 1 A at 9.6 V + 1 A × 4 Ω = 13.6 V.
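
For concreteness, here is a minimal sketch of that linear ESR model in Python (the 20% split, and everything derived from it, are the guesses above, not measured values):

```python
# Linear ESR model for the whole 3S string: V_supply = V_knee + I * ESR.
# All inputs are the guesses from the text, not datasheet values.

V_MEASURED = 12.0    # V, supply voltage during the 0.6 A observation
I_MEASURED = 0.6     # A, current actually drawn at that voltage
ESR_FRACTION = 0.20  # assumed share of the drop caused by series resistance

# Effective series resistance of the string.
esr = V_MEASURED * ESR_FRACTION / I_MEASURED   # 4.0 ohms
# "Knee" voltage: the diode-like part of the drop at this operating point.
v_knee = V_MEASURED - I_MEASURED * esr         # 9.6 V

def current_at(v_supply: float) -> float:
    """Current predicted by the linear model (invalid near/below the knee)."""
    return max(0.0, (v_supply - v_knee) / esr)

def voltage_for(i_target: float) -> float:
    """Supply voltage needed to push a target current through the string."""
    return v_knee + i_target * esr

print(f"ESR ~ {esr:.1f} ohm, knee ~ {v_knee:.1f} V")
print(f"Voltage for 1.0 A: {voltage_for(1.0):.1f} V")  # 13.6 V, as above
print(f"Current at 12 V:   {current_at(12.0):.2f} A")  # 0.60 A
```

With these guessed numbers the model reproduces both observations: roughly 0.6 A at 12 V, and about 13.6 V needed for the full 1 A.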

(The simple ESR model breaks down badly at low current.)


Edit: rereading what I wrote, I was assuming 4 × 3 V LEDs in series. But there are only 3 LEDs in series, so the numbers above do not quite add up.