I'm trying to implement dimming control for the lights in my kitchen hob extractor. The lights are 4x LED spotlights, each rated 3 V, 700 mA (2.1 W). They are currently wired in series and powered from a 700 mA, 10 W constant-current LED driver.
The LED dimmer module I'm trying to use is taken from an Ikea TRÅDFRI LED driver. The PWM dimming module can easily be removed and powered from an external 12 V source instead of the 24 V supplied by the Ikea power supply (see here).
The 700 mA, 10 W constant-current power supply puts out about 12 V with the LEDs connected in series. I tried wiring the dimmer module directly into the output of the constant-current supply and connecting the LEDs to the output of the dimmer, but this did not work. I think a constant-voltage supply is the only way to go with this dimmer.
I was thinking about finding a 12 V constant-voltage LED driver to power the dimmer, with the LEDs in series on its output; however, I'd then need a 6+ W resistor to limit the current to the LEDs, which seems a little inefficient.
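For a sense of scale, every volt the resistor drops at 700 mA costs 0.7 W of heat; the headroom figures below are just example values I picked, not measurements:

```python
# Back-of-envelope for the series-resistor idea: at 700 mA, every volt
# dropped across the resistor turns into 0.7 W of heat. The headroom
# voltages below are example figures only.
I_LED = 0.7  # string current in amps

for v_drop in (2.0, 5.0, 8.5):  # example resistor headroom voltages
    print(f"{v_drop:3.1f} V across the resistor -> {v_drop * I_LED:4.2f} W wasted")
```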
Could anyone suggest a better arrangement for achieving this LED dimming setup?
Best Answer
st2000's answer is on the right track, but does not go far enough. A constant-voltage drive is likely to destroy the LED. The problem is that, for a given voltage, the current drawn increases as the LED warms up. Since power is voltage times current, the power dissipated by the LED goes up, which raises the temperature further, which increases the current again, and so on. It's called thermal runaway. Don't do it.
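Here is a toy calculation of that feedback loop. Every number in it (the tempco, the exponential slope, the thermal resistance) is made up for illustration and does not describe your particular LEDs:

```python
# Toy illustration of the thermal-runaway loop under constant-voltage drive.
# All numbers are invented for illustration -- not a model of a real LED.
import math

V_APPLIED = 3.0   # constant voltage forced across one LED (assumption)
VF_25 = 3.0       # forward voltage at 700 mA and 25 C (assumption)
TEMPCO = -0.003   # V per degree C: Vf falls as the junction heats up
SLOPE = 0.05      # volts of overdrive per e-fold of current (assumption)
I_NOM = 0.7       # current at nominal Vf, amps
R_TH = 20.0       # junction-to-ambient thermal resistance, C/W (assumption)
T_AMBIENT = 25.0

t = T_AMBIENT
for step in range(6):
    vf = VF_25 + TEMPCO * (t - 25.0)                 # hotter junction -> lower Vf
    i = I_NOM * math.exp((V_APPLIED - vf) / SLOPE)   # fixed voltage -> more current
    p = V_APPLIED * i                                # more current -> more power
    t = T_AMBIENT + R_TH * p                         # more power -> hotter junction
    print(f"step {step}: T = {t:6.1f} C, I = {i:5.2f} A, P = {p:5.1f} W")
    if i > 5:                                        # stop once runaway is obvious
        print("...and away it goes.")
        break
```

Under constant current the same tempco works in your favour: as the LED warms, its forward voltage drops, so the power actually falls slightly and the loop self-limits.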
And yes, you get around this by adding a series resistor with a decent voltage drop across it, so that small shifts in the LED forward voltage produce only small changes in current. And yes, it's terribly inefficient. The solution is to drive the LED with constant current. You can make a switching constant-current supply that is compact and efficient, which is why commercial LED drivers do it that way.
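To put rough numbers on both points, here is a sketch comparing how the current shifts as the LEDs warm up for different amounts of resistor headroom; the string voltage and warm-up shift are assumed values, not taken from your setup:

```python
# Why a "decent voltage drop" across the resistor matters: the same small
# shift in LED forward voltage moves the current much less when the resistor
# absorbs a good fraction of the supply. All values are assumptions.
I_TARGET = 0.7    # target string current, amps
V_STRING = 10.5   # assumed LED string forward voltage at 700 mA, cold
DELTA_VF = -0.3   # assumed drop in string forward voltage once the LEDs warm up

for v_supply in (11.0, 15.0, 24.0):                   # increasing resistor headroom
    r = (v_supply - V_STRING) / I_TARGET              # resistor chosen for 700 mA cold
    i_hot = (v_supply - (V_STRING + DELTA_VF)) / r    # current once the LEDs warm up
    p_resistor = i_hot**2 * r                         # power wasted in the resistor
    print(f"{v_supply:4.1f} V supply: R = {r:5.2f} ohm, "
          f"hot current = {i_hot*1000:4.0f} mA, wasted = {p_resistor:4.1f} W")
```

More headroom keeps the current stable but burns ever more power in the resistor; a switching constant-current supply gives you the stability without the waste.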
Granted, this isn't the answer you're looking for, but reality can be cruel that way.