What you want to do is match the driver's output voltage range to the lamp's input voltage, and match the driver's output current to the current the lamp needs at that voltage to produce its rated power.
I'm going to make up an example using hardware found on the internet.
Here's a power supply that puts out 40 watts. It supplies 700 mA at anywhere from 28 to 56 volts DC.
Here's an LED that needs 2.85 volts to operate.
2.85 volts at 700 mA is about 2 watts, which is less than the LED's maximum power and current ratings.
What you do is put 10 of those LEDs in series. You now need 28.5 volts to turn them on, which is within the voltage range of the power supply.
The power supply will apply voltage to that string of LEDs and keep raising it until 700 mA is flowing. At that point it holds the voltage, and the LEDs stay lit at a constant brightness.
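The sizing above can be sketched as a quick check (the figures are the made-up example numbers from this answer; substitute your real datasheet values):

```python
# Check a series LED string against a constant-current driver's
# output-voltage (compliance) range. Example figures from the answer above.
led_vf = 2.85               # forward voltage per LED at 700 mA, volts
i_drive = 0.700             # driver's regulated current, amps
v_min, v_max = 28.0, 56.0   # driver's output-voltage range, volts

n_leds = 10
string_v = n_leds * led_vf        # total forward voltage of the string
string_p = string_v * i_drive     # total power delivered to the string

assert v_min <= string_v <= v_max, "string voltage outside driver range"
print(f"{n_leds} LEDs: {string_v:.1f} V, {string_p:.1f} W")
```

The same check tells you the limits: fewer than 10 of these LEDs and the string voltage falls below the driver's 28 V floor; more than 19 and it exceeds the 56 V ceiling.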
If the forward voltage of the LEDs drops (as it will when they warm up), the power supply lowers its output voltage to keep the current from rising. This also keeps the brightness constant.
Since forward voltage drops as temperature rises, a fixed-voltage supply would let the LEDs draw more current. That extra current heats them further, dropping the forward voltage more and drawing still more current, repeating until the magic smoke is released and your LED goes "poof."
That doesn't happen here, though, since this is a constant-current source. It keeps the voltage just high enough to allow only the specified current to flow. No overheating, no runaway drop in forward voltage, no "poof."
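The runaway can be illustrated with a toy numerical model. Every number here (thermal resistance, series resistance, tempco) is made up purely for illustration; this is not a real LED model, just a sketch of the feedback loop described above:

```python
# Toy model of thermal runaway under a fixed-voltage supply.
# Vf drops ~2 mV/C as the junction heats; heating is proportional to power.
VF0, TC = 2.85, 0.002   # forward voltage at 25 C, tempco in V/C (illustrative)
THETA = 30.0            # thermal resistance, C/W (made up)
R_SERIES = 0.05         # small series/wiring resistance, ohms (made up)
V_SUPPLY = 2.90         # fixed-voltage source, volts

def step_fixed_voltage(vf):
    """One relaxation step: current set by supply minus Vf, then Vf re-drops."""
    i = max((V_SUPPLY - vf) / R_SERIES, 0.0)   # current through the LED
    t = 25.0 + THETA * vf * i                  # junction temperature
    return VF0 - TC * (t - 25.0), i            # new (lower) Vf, current drawn

vf, currents = VF0, []
for _ in range(5):
    vf, i = step_fixed_voltage(vf)
    currents.append(i)

# The current climbs on every step: runaway. A constant-current source
# breaks this loop by pinning i and letting the voltage fall instead.
print([round(i, 2) for i in currents])
```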
So. You must match the current, power, and voltage for your LED to the current and voltage rating of the driver.
To tell you whether you have the proper driver, we would need to see the datasheets of both the driver and the LED. The needed data might also be on the data plates of the driver and the LED.
In response to added data:
- 40 watts at 36 volts is 1.1 amperes maximum for your LED. The driver supplies only a constant 800 mA, so that's OK.
- The LED accepts 36 V to 48 V, and the driver can supply 3 V to 62 V, so the LED is in range. If the lamp doesn't draw the 800 mA by 48 volts, the driver could conceivably put a higher voltage on it than the LED likes. That's not likely, though, as 40 watts at 48 volts is also around 800 mA.
So, it is a close match. Driving the LED a little under its rated current won't hurt and may help it last longer. Driving it at only 800 mA lowers the power to around 30 watts, so it might be noticeably dimmer than one driven at 40 watts. You won't have anything to compare it to, though, so how could you tell?
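The arithmetic behind those two bullet points can be double-checked in a few lines (using the driver and LED figures quoted above):

```python
# Verify the driver/LED match from the answer's figures.
p_led_max = 40.0                # LED module's power rating, watts
v_led_lo, v_led_hi = 36.0, 48.0 # LED module's accepted voltage range
i_driver = 0.800                # driver's constant current, amps

i_max_at_lo = p_led_max / v_led_lo   # max allowed current at 36 V (~1.11 A)
i_at_hi = p_led_max / v_led_hi       # current implied by 40 W at 48 V (~0.83 A)
p_actual_lo = i_driver * v_led_lo    # power actually delivered near 36 V

assert i_driver < i_max_at_lo        # driver stays under the LED's limit
print(f"allowed: {i_max_at_lo:.2f} A, delivered: {i_driver:.2f} A, "
      f"power near 36 V: {p_actual_lo:.1f} W")
```

The ~28.8 W result is where the "around 30 watts" figure comes from.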
Yep, go ahead.
This may work. On the diode's datasheet, you can see that the laser power graph only goes up to 280 mA, but your constant-current source is going to provide 2500 mA. I would suggest you find a 250 mA constant-current source to correctly use this diode.
Edit 9/27: the link to the old datasheet for the power supply is broken, and I was redirected to the new datasheet. Everything below refers strictly to the new sheet.
There appears to be a 5 V analog control input on the module that I did not see before. Using it, you could modulate the laser diode's supply with a 5 V PWM signal, or even set a fixed level with a voltage divider. If you decide to go the voltage divider route, then I would use values of 1k and 9k.
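For the divider option, here is the standard unloaded-divider arithmetic with the 1k/9k values suggested above. Which resistor goes on top depends on the control level you want, and I'm assuming a plain two-resistor divider on the 5 V analog input (check the module's datasheet for the input's impedance before relying on this):

```python
# Unloaded resistive divider: Vout = Vin * R_bottom / (R_top + R_bottom).
def divider_out(v_in, r_top, r_bottom):
    """Output voltage of a two-resistor divider with no load on the tap."""
    return v_in * r_bottom / (r_top + r_bottom)

# With 1k on top and 9k to ground, a 5 V control level becomes 4.5 V (90%);
# with the resistors swapped, it becomes 0.5 V (10%).
print(divider_out(5.0, 1e3, 9e3))
print(divider_out(5.0, 9e3, 1e3))
```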
Best Answer
According to the datasheet for your LED driver, it has a 3-LED minimum load. The 10-43 V rating of the driver equates to 3-12 LEDs wired in series. Driving a single LED will probably cause the driver to lose regulation, with the smoky result you witnessed.
Get two more LEDs (three, preferably; I wouldn't reuse one that has produced smoke) and the driver should work well for you.