Electronic – How does a DC18V ~ 36V LED driver work with DC3.6V LEDs without burning them out

led, led-driver

I bought some slightly more powerful leds to build a hobby project and grow some herbs. The kit has 10 LEDs that have the following spec:

Power: 3W
Voltage: DC3.6V
Current: 700mA

It comes with a driver:

Output Voltage: DC18V ~ 36V
Output Current: 600mA

The idea is to put all 10 in series and drive them from AC with that driver.

My question is: how can this work without burning them out? I know about the voltage drop, but doesn't the first LED in the string still get a high voltage? They are only rated at ~3.6V.

I "tested" one LED with 9V battery and it lighted up very bright and then it was dead. Was it because of 9V or was it because it drew that much current? What am I missing here, it looks like there is some very basic thing I don't understand. I have studied the OHM law and have built some simple things, but I'm out of ideas here.

Best Answer

Your driver will output a fixed current through a string of LEDs.

Since your LEDs are rated at 700mA @ 3.6V, the driver's regulated 600mA stays below their maximum, so it will supply adequate current for the LEDs without burning them out.

How many LEDs you can put in series on a driver depends on the driver's output voltage range, in this case 18V to 36V.

So you can attach between 18/3.6 = 5 and 36/3.6 = 10 LEDs to this driver.
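That head count is just the driver's voltage range divided by the per-LED forward voltage. A minimal sketch of the arithmetic (the 3.6V and 18–36V figures come from the specs above; the rounding directions are the assumption that the string voltage must land inside the driver's range):

```python
import math

led_vf = 3.6                # rated forward voltage per LED, from the kit's spec
v_min, v_max = 18.0, 36.0   # driver's output voltage range

min_leds = math.ceil(v_min / led_vf)   # fewer LEDs and the driver can't go low enough
max_leds = math.floor(v_max / led_vf)  # more LEDs and the driver can't go high enough

print(min_leds, max_leds)  # 5 10
```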

schematic: ten LEDs in series across the constant-current driver output

With 10 LEDs the output from the driver will be around 36V. That voltage is divided across the LEDs, so no individual diode sees more than its rated voltage.

Or more accurately: when driven with 600mA, each LED develops a forward voltage across it close to 3.6V, and the total of those voltages is what you measure at the output of the driver.
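In other words, the driver's output voltage is simply the sum of the individual forward voltages at the regulated current. A sketch with made-up but realistic per-LED values (real LEDs scatter a little around the rated 3.6V):

```python
# Illustrative per-LED forward voltages at 600mA (hypothetical values).
vf_at_600ma = [3.55, 3.60, 3.58, 3.62, 3.60, 3.57, 3.61, 3.59, 3.60, 3.58]

# Series string: the same 600mA flows through every LED,
# and the voltages simply add up.
v_out = sum(vf_at_600ma)

print(round(v_out, 2))  # 35.9 — within the driver's 18-36V range
```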

When you hooked an LED directly up to the 9V battery you vastly exceeded its rated voltage, so nothing limited the current, and it went out with a flash of light. You basically fused it.
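For completeness, this is what a safe 9V hookup would have required: a series resistor sized by Ohm's law to drop the excess voltage at the target current. A sketch, using the kit's rated figures (the resistor and its power rating are the quantities being derived, not part of the original kit):

```python
v_batt = 9.0     # battery voltage
led_vf = 3.6     # LED's rated forward voltage
i_target = 0.7   # LED's rated current, 700mA

# Ohm's law on the resistor: it must drop (9 - 3.6) V at 0.7 A.
r_series = (v_batt - led_vf) / i_target        # ohms
p_resistor = (v_batt - led_vf) * i_target      # watts it must dissipate

print(round(r_series, 2), round(p_resistor, 2))  # 7.71 ohms, 3.78 W
```

Note the resistor would burn off almost 3.8W while the LED uses about 2.5W, which is exactly why a constant-current driver is the better tool for power LEDs.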