LED lamp with more watts than LED driver

Tags: led, led-driver, power, power-supply

I bought a 40 W LED lamp and a 40 W LED driver to power it; however, the salesman delivered a 36 W LED driver. I called him to return the item, and he said that this was the right way to power the lamp. He also said that if I power the 40 W LED lamp with a 40 W LED driver, it will shorten the lamp's life.

Honestly, I think he is trying to fool me. I searched the web looking for answers, but found nothing. Does anyone know what must be checked to fit the right driver to the lamp?

Lamp data:

  • Model: OR-PAN-620620-40W
  • Power: 40 W
  • Voltage: 36-48 V DC
  • Luminous flux: 3200 lm
  • CCT: 4000-4300 K
  • Ra: >80

Driver data:

  • Input: 0.37 A @ 120 V AC
  • Pout: 36 W
  • Uout: 3-62 V DC
  • Iout: 800 mA (constant)

Best Answer

What you want to do is match the output voltage range of the driver to the input voltage range of the lamp, and match the output current of the driver to the current the lamp draws at its specified voltage to produce its specified power.

I'm going to make up an example using hardware found on the internet.

Here's a power supply that puts out 40 W. It supplies 700 mA at between 28 and 56 V DC. Here's an LED that needs 2.85 V to operate.

2.85 V at 700 mA is about 2 W, which is less than the LED's maximum power and current ratings.

What you do is put 10 of those LEDs in series. You now need 28.5 V to make them turn on, which is within the voltage range of the power supply.
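Here's the same arithmetic as a minimal Python sketch, using the made-up power supply and LED figures from this example (nothing here comes from a real datasheet):

    # Made-up example figures from above, not real part data.
    led_vf = 2.85                          # forward voltage of one LED, volts
    driver_i = 0.700                       # driver constant current, amperes
    driver_vmin, driver_vmax = 28.0, 56.0  # driver output voltage window, volts

    n = 10                          # LEDs in series
    string_v = n * led_vf           # 28.5 V needed to light the string
    per_led_w = led_vf * driver_i   # ~2.0 W dissipated per LED
    string_w = string_v * driver_i  # ~20 W for the whole string

    print(f"string needs {string_v:.1f} V, in driver range: "
          f"{driver_vmin <= string_v <= driver_vmax}")
    print(f"{per_led_w:.2f} W per LED, {string_w:.1f} W total")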

The power supply will apply voltage to that string of LEDs and keep raising the voltage until 700 mA is flowing. At that point it will hold the voltage, and the LEDs will stay lit at a constant brightness.

If the forward voltage of the LEDs goes down (as it will when the LEDs warm up), then the power supply will lower its output voltage to keep the current from going up. This also keeps the brightness constant.

Since the forward voltage goes down as temperature goes up, with a fixed voltage the LEDs would draw more current. This would make them heat up, drop the forward voltage further, and draw still more current, repeating until the magic smoke is released and your LED goes "poof."

That doesn't happen, though, since this is a constant current source. It keeps the voltage just high enough to allow only a specified current to flow. No overheating, no drop in the forward voltage, no "poof."
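To make that concrete, here is a toy Python illustration (not a circuit simulation) of the constant-current behavior; the -2 mV/°C temperature coefficient per LED is an assumed, typical figure:

    # Toy model: as the junctions warm, Vf drops, and a constant-current
    # driver simply lowers its output voltage to hold the set current.
    def vf_at(temp_c, vf_25=2.85, tempco=-0.002):
        """Forward voltage of one LED at temp_c (assumed -2 mV/degC)."""
        return vf_25 + tempco * (temp_c - 25.0)

    n_leds = 10
    i_set = 0.700  # the driver holds this current, amperes

    for temp_c in (25, 45, 65, 85):
        v_out = n_leds * vf_at(temp_c)  # voltage the driver settles at
        print(f"{temp_c:3d} degC: driver output {v_out:.2f} V, "
              f"current {i_set * 1000:.0f} mA, power {v_out * i_set:.1f} W")

The power falls slightly as the string warms, but the current, and so the brightness, stays put.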

So: you must match the current, power, and voltage of your LED to the current and voltage rating of the driver.
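As a rough sketch of that checklist in Python (a hypothetical helper I'm making up here, not anything from a datasheet): the driver's current must not exceed the lamp's current at rated power, and the lamp's voltage window must sit inside the driver's output range.

    # Hypothetical matching check, encoding the rules above.
    def driver_matches(lamp_w, lamp_vmin, lamp_vmax, drv_i, drv_vmin, drv_vmax):
        lamp_imax = lamp_w / lamp_vmin              # lamp current at the low end
        current_ok = drv_i <= lamp_imax             # driver can't overdrive it
        voltage_ok = drv_vmin <= lamp_vmin and lamp_vmax <= drv_vmax
        return current_ok and voltage_ok

    # The made-up 40 W supply and 10-LED string from the example above:
    print(driver_matches(lamp_w=20.0, lamp_vmin=28.5, lamp_vmax=28.5,
                         drv_i=0.700, drv_vmin=28.0, drv_vmax=56.0))  # True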

To tell you whether you have the proper driver, we would need to see the data sheets of both the driver and the LED. The needed data might also be on the data plates of the driver and the LED.


In response to added data:

  • 40 W at 36 V is 1.1 A maximum for your LED. The driver supplies only 800 mA constant, so that's OK.
  • The LED accepts 36 V to 48 V, and the driver can supply from 3 V to 62 V, so the LED is within range. If the lamp doesn't draw the 800 mA by 48 V, the driver could conceivably put a higher voltage on it than the LED likes. That's not likely, though, as 40 W at 48 V is also around 800 mA (the arithmetic is worked through in the sketch after this list).
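Plugging the numbers from the question into a quick Python check (same arithmetic as the bullets above; the only assumption is that the lamp behaves like a simple constant-power load across its voltage window):

    # Figures from the question: 40 W lamp, 36-48 V; 36 W / 800 mA driver.
    lamp_w, lamp_vmin, lamp_vmax = 40.0, 36.0, 48.0
    drv_i = 0.800

    print(f"lamp current at {lamp_vmin:.0f} V: {lamp_w / lamp_vmin:.2f} A")  # 1.11 A
    print(f"lamp current at {lamp_vmax:.0f} V: {lamp_w / lamp_vmax:.2f} A")  # 0.83 A
    # At a fixed 800 mA the delivered power lands between these bounds,
    # i.e. roughly 30 W at a typical ~38 V operating point:
    print(f"power at 800 mA: {drv_i * lamp_vmin:.1f} W to {drv_i * lamp_vmax:.1f} W")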

So, it is a close match. Driving the LED a little under its rated current won't hurt and may help the LED last longer. Driving the LED at only 800 mA will lower the power to around 30 W, so it might be noticeably dimmer than one driven at 40 W. You won't have anything to compare it to, though, so how could you tell?

Yep, go ahead.
