LED Strip – Why LED Strip is Drawing More Current Than Expected

current-limiting, led strip, power supply, voltage-drop

I am trying to wire some LED strips, but they are getting a lot hotter than expected. I have noticed that they are also drawing much more current than expected. I am going to show what I've measured first, as the product details seem to be contradictory.

LED strip wiring setup

I have two lengths of 2m of 24v CCT/dual-white LED strip connected to a 150W 24v constant-voltage PSU (https://www.nationallighting.co.uk/indoor-lighting/led-strip-lighting/150w-24v-constant-voltage-led-driver-not-dimmable).

The strips are about 4.8m away (they will need to be about 10m once installed properly) and are wired using the standard flat RGB wire (4 x AWG22/0.32mm^2). Two of the wires are connected together for the positive side.
The measurements:

  1. 2.8A coming from the power supply, for both strips, so this should be 1.4A for each strip
  2. 22.6v at the start of the strip
  3. 22.2v at the end of the strip.
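As a quick sanity check on these numbers (taking the 10.8W/m figure from the listing at face value), the power the strips are actually pulling works out well above spec:

```python
# Rough power check: measured draw vs. the 10.8 W/m nameplate figure.
# Assumptions: 22.6 V at the strip input (measured), 2.8 A total shared
# between two 2 m strips; power at the PSU end will be slightly higher
# because of the wire drop.
strip_length_m = 2.0            # per strip
strips = 2
rated_w_per_m = 10.8            # from the product listing

expected_power = strips * strip_length_m * rated_w_per_m    # 43.2 W
measured_power = 22.6 * 2.8                                 # ~63.3 W

print(f"expected: {expected_power:.1f} W, measured: {measured_power:.1f} W")
print(f"ratio: {measured_power / expected_power:.2f}x the rated power")
```

So the strips appear to be running at roughly 1.5× their rated power, which would explain the heat.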

The product listing has seemingly contradictory info: https://www.ledison-led-lights.co.uk/strip-lights/led-strip-lights/54watt-led-strip-light-5m-dual-colour-10-8w-meter.html

  1. 10.8W/m in the image and specs
  2. 1200 lm/m according to one of the images – using 1, this implies ~111 lm/W
  3. 90 lm/W in the specs – different from 2
  4. 800 lm (WW) / 900 lm (CW) in the specs – implies a total of 1700 lm/m, which at 10.8W would yield ~157 lm/W.
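To make the contradiction concrete, here is the efficacy implied by each of the listing's own figures (assuming the 800/900 lm values are per metre, as point 4 suggests):

```python
# Implied luminous efficacy (lm/W) from the listing's own figures,
# taking the 10.8 W/m rating at face value.
w_per_m = 10.8

lm_per_m_image = 1200           # "1200 lm/m" from one of the images
lm_per_w_specs = 90             # "90 lm/W" from the specs
lm_per_m_ww_cw = 800 + 900      # "800 lm (WW) / 900 lm (CW)", assumed per metre

print(f"from the image : {lm_per_m_image / w_per_m:.0f} lm/W")   # ~111 lm/W
print(f"from the specs : {lm_per_w_specs} lm/W")
print(f"from WW+CW     : {lm_per_m_ww_cw / w_per_m:.0f} lm/W")   # ~157 lm/W
```

Those are three quite different efficacies for the same product.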

I did try to ring the seller before purchasing to get clarification and ask for a spec PDF, but they kept quoting the 1200 lm and 10.8W figures and didn't really address points 3 and 4 after I tried to explain my numbers. I ended up buying it anyway just to see what happened.


The main/intended outcome: how to reduce heat/increase lifespan

I have mounted the strip to aluminium profiles, but the profile is still getting too warm for my liking, and I would like to run the strips at a lower temperature/current to increase their life. Ultimately I will be driving them using PWM from a custom STM32-based circuit, but I still want to know that at 100% brightness they are not getting too hot. I assume that PWM alone wouldn't reduce the temperature that much, as the LEDs are still driven at the same current during the on portion of each cycle.

Whilst the specs are contradictory, this is the first LED strip I have seen draw more current than specified. Is this even possible, or do LEDs just draw as much as they need immediately after turning on (so no thermal runaway)?


My research

I am fairly new to working with LED strips. I am mostly a computer scientist with a bit of embedded software/electronics knowledge, but not really the electrical-characteristics side of things.

I have learnt that too much current (and the resulting heat) is ultimately what kills LEDs, and that they care more about being fed a fixed current near their rated voltage than a fixed voltage at roughly the correct current (I need to find the link/forum post where I read this). Because of this, they can be driven at a lower voltage/current and actually have a higher efficiency when doing so.

Most questions online are about LED strips not drawing as much current as expected, or not being bright enough, etc.

I was also expecting that the voltage drop over 4.8m of quite thin wire would have reduced the voltage (and therefore the current?) enough to reduce the heat output. I may not have a correct understanding of the relationship between current/voltage and resistance here, and which one causes the other.

I have tried multiple voltage drop calculators, which give differing values, all of which are higher than what I actually measured, so I'm not sure what knowledge I'm missing there. I've sketched my own hand calculation after these links:

  1. https://photovoltaic-software.com/solar-tools/dc-ac-drop-voltage-calculator
  2. https://www.conceptcables.com/technical/voltage-drop-calculator
  3. https://www.calculator.net/voltage-drop-calculator.html
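For reference, this is the hand calculation I would expect such a calculator to perform, with my assumptions spelled out: roughly 53 mΩ/m for AWG22 copper (from standard wire tables), the two paralleled positive cores, and the WW/CW return currents splitting roughly evenly:

```python
# Expected voltage drop over the 4.8 m feed.
# Assumptions: AWG22 copper at ~0.053 ohm/m, two cores paralleled for +24 V
# carrying the full 2.8 A, and the WW/CW return cores each carrying ~1.4 A.
R_PER_M_AWG22 = 0.053    # ohm per metre, approximate
length_m = 4.8

r_positive = (R_PER_M_AWG22 * length_m) / 2     # two cores in parallel
r_return = R_PER_M_AWG22 * length_m             # one core per channel

drop_positive = 2.8 * r_positive                # full current on the + pair
drop_return = 1.4 * r_return                    # ~half the current per return core
print(f"+ pair drop : {drop_positive:.2f} V")   # ~0.36 V
print(f"return drop : {drop_return:.2f} V")     # ~0.36 V
print(f"total       : {drop_positive + drop_return:.2f} V")          # ~0.7 V

# A calculator that assumes a single out-and-back AWG22 conductor pair at the
# full 2.8 A would report roughly double this:
print(f"single-pair estimate: {2.8 * R_PER_M_AWG22 * length_m * 2:.2f} V")  # ~1.4 V
```

If the calculators assume a single conductor pair carrying the full 2.8A, that alone would explain them reporting roughly twice the drop I see.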

I suspect that I will need to use a step-down (buck) converter and use its current-limiting feature (I've got one of these https://www.ebay.co.uk/itm/DC-DC-BUCK-CONVERTER-STEP-DOWN-8-40V-TO-1-25-36V-8A-12A/173554594886).


The real question(s)?

I think that maybe I have a misunderstanding of how current limiting at a given voltage actually works, and perhaps that is what the question should have been. I am also confused about why using a buck converter to reduce the current is acceptable, yet using a power supply that can only provide that same current is not.
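To pin down the mental model I'm working from, here is a toy version of it. The split between the "LED knee" voltage and the lumped on-strip resistance below is a guess, chosen only so that the model reproduces the ~1.4A per strip I measure at 22.6v:

```python
# Toy model of one strip: a fixed "LED knee" voltage plus the on-strip
# current-setting resistors lumped into one series resistance.  Both values
# are illustrative guesses, not measurements of this particular strip.
V_KNEE = 17.0     # V, lumped forward voltage of the LED chain (assumed)
R_SERIES = 4.0    # ohm, lumped on-strip resistors (assumed)

def strip_current(v_applied):
    """Current the strip draws at a given applied voltage (toy model)."""
    return max(0.0, (v_applied - V_KNEE) / R_SERIES)

# Constant-voltage PSU: it holds the voltage and the strip decides the current.
print(f"at 24.0 V the strip takes {strip_current(24.0):.2f} A")   # ~1.75 A
print(f"at 22.6 V the strip takes {strip_current(22.6):.2f} A")   # ~1.40 A

# Buck converter in current-limit mode: it lowers its output voltage until
# the strip only draws the set limit.
i_limit = 0.9                           # A, chosen limit
v_out = V_KNEE + i_limit * R_SERIES     # voltage the converter settles at
print(f"a {i_limit} A limit settles the output at ~{v_out:.1f} V")   # ~20.6 V
```

Is this the right picture: a constant-voltage supply rated for less current doesn't gracefully limit the strip (it typically sags, hiccups or overheats instead), whereas a converter with a proper current-limit mode deliberately folds its output voltage back until the strip only draws the set limit?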

I did also have a probably-crazy idea that the voltage drop could be beneficial if it reduces the current to the LEDs without needing a buck converter: it would distribute the heat along the entire wire instead of adding another module that needs heatsinking, provided the wire stays within its temperature/current ratings. It didn't seem to work in this case, and I suspect it wouldn't at 10m either, or would be too unpredictable. Given that the voltage drop calculators didn't match my results, I may be missing some knowledge there as well.


EDIT – added PSU images; there are a few components under the last bit of rubber, but I couldn't remove it easily:
[photos of the PSU PCB]

Best Answer

You have made quite a lengthy post here, and it takes a lot of time just to read it and keep everything you wrote straight.

I will just make some points here:

  1. You can reduce the LED strip current by reducing the voltage, but for that you need to either get an adjustable power supply (like a MeanWell PSU) that allows at least a slight voltage adjustment, or modify your existing PSU (this requires electronics knowledge; I could help you if you provide detailed photos of its PCB and its components).
  2. Normally, the voltage across the LEDs is about 75% of the total voltage across an LED strip, and the other 25% is across the current-limiting resistors.
    LEDs are highly non-linear devices, which means that the current through them changes drastically for a small change in the voltage applied across them. Driving a bare LED from a fixed supply voltage is never recommended, because a slight variation from one diode to another, a change in temperature, or a small change in the supply voltage can cause the current to vary by 30% to 100% or even more.
    LEDs have a specified current over some small voltage range, and while the voltage applied to them can be loosely controlled, the current needs to be controlled much more tightly, since LEDs themselves are extremely poor at controlling or limiting it.
    That's why you pretty much always see a resistor (or some other current-controlling device) in series with an LED, because resistors are linear and predictable devices, so they provide LEDs with that linear portion necessary to somewhat limit and control the current.
    A 4% voltage increase can cause a 100% current increase in an LED.

  3. You can place about 4 regular rectifier diodes in series with your LED strip; you will get a somewhat lower voltage across the strip, but a significantly lower current through it. Just try it and you will see! You can even try adding a 5th diode to drop the strip voltage even more.
    What I prefer to do is reduce the current through LEDs down to 30-50% of their rated current in order to both reduce overheating and significantly extend their operating life.
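To put rough numbers on points 2 and 3, using the asker's measurement of about 1.4 A per strip at 22.6 V together with the 75%/25% split from point 2 (treating the LED chain voltage as constant, which is only an approximation):

```python
# Rough numbers for points 2 and 3.  Uses the measured ~1.4 A per strip at
# 22.6 V and the ~75%/25% LED/resistor voltage split from point 2; treating
# the LED chain voltage as constant is only an approximation.
V_STRIP = 22.6
I_STRIP = 1.4                           # A per strip (measured)

v_leds = 0.75 * V_STRIP                 # ~17.0 V across the LEDs
v_resistors = 0.25 * V_STRIP            # ~5.6 V across the on-strip resistors
r_equiv = v_resistors / I_STRIP         # ~4 ohm lumped resistor value

# Each rectifier diode drops roughly 0.7 V at these currents.  With the LED
# voltage held constant, every volt removed comes straight off the resistors,
# so the strip current falls almost linearly with the number of diodes.
for n_diodes in range(6):
    v_remaining = V_STRIP - 0.7 * n_diodes
    i_new = max(0.0, (v_remaining - v_leds) / r_equiv)
    print(f"{n_diodes} diodes: ~{i_new:.2f} A per strip")
```

In reality the LED forward voltage sags a little as the current falls, so the actual reduction will be somewhat smaller, but it shows why four or five 0.7 V diodes cut the current far more than the small relative voltage change would suggest.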