Distributing 700W at 5V over 10m

led-strip power-supply

I'm trying to build an LED panel around my living room, using 10 strips of ~10 m packed with WS2812 LEDs. A rough estimate of the power required with all LEDs at full brightness is about 700 W.
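Just to put that figure in perspective, here's the straightforward arithmetic from those numbers, assuming the load splits evenly across the strips:

    #include <stdio.h>

    int main(void)
    {
        const double total_power_w = 700.0; /* rough estimate from above     */
        const double supply_v      = 5.0;   /* WS2812 supply voltage         */
        const int    strips        = 10;

        double total_current_a = total_power_w / supply_v;  /* I = P / V     */
        double per_strip_a     = total_current_a / strips;  /* even split    */

        printf("total current: %.0f A\n", total_current_a); /* 140 A         */
        printf("per strip:     %.0f A\n", per_strip_a);     /*  14 A         */
        return 0;
    }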

The flexible PCB of the LED strips obviously can't carry that much current, so I need to inject power at least once per meter. I'm looking for the sanest way to do that.
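To illustrate why, a crude voltage-drop estimate: the load is treated as uniformly distributed, and the ~0.3 Ω/m for the +5 V and GND traces combined is an assumption (measure your own strip, it varies a lot between products):

    #include <stdio.h>

    /* Worst-case drop at the far end of a segment fed from one end,
     * with the load spread uniformly along it:
     *   V_drop ~= I_segment * R_per_metre * length / 2                      */
    static double end_drop(double seg_current_a, double r_per_m, double seg_len_m)
    {
        return seg_current_a * r_per_m * seg_len_m / 2.0;
    }

    int main(void)
    {
        const double r_per_m       = 0.3;  /* assumed ohm/m, 5V + GND traces */
        const double strip_current = 14.0; /* one strip's share of 700 W @ 5 V */
        const double strip_len_m   = 10.0;

        printf("fed at one end only: %.1f V drop\n",
               end_drop(strip_current, r_per_m, strip_len_m));       /* ~21 V   */
        printf("fed every metre:     %.2f V drop per segment\n",
               end_drop(strip_current / strip_len_m, r_per_m, 1.0)); /* ~0.21 V */
        return 0;
    }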

My current approach would be two metal pipes along the wall (which would also provide mechanical support), with the individual panels clipping onto them and simple connectors for the signal lines. I'd distribute the 5 V from the PSUs directly over those pipes.
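Whether the pipes themselves can carry that current depends a lot on the metal and the wall thickness; here's a quick check, where the pipe dimensions are placeholders, the resistivities are textbook values and the steel figure is only approximate:

    #include <stdio.h>

    /* R = rho * L / A, where A is the annular cross-section of the pipe wall. */
    static double pipe_resistance(double rho, double len_m,
                                  double outer_d_m, double wall_m)
    {
        const double pi = 3.14159265358979;
        double r_out = outer_d_m / 2.0;
        double r_in  = r_out - wall_m;
        return rho * len_m / (pi * (r_out * r_out - r_in * r_in));
    }

    int main(void)
    {
        const double len = 10.0, od = 0.015, wall = 0.001; /* assumed 15 mm OD, 1 mm wall */
        const double current = 140.0;                      /* 700 W / 5 V worst case      */

        double r_cu = pipe_resistance(1.68e-8, len, od, wall); /* copper             */
        double r_fe = pipe_resistance(1.4e-7,  len, od, wall); /* mild steel, rough  */

        printf("copper: %.4f ohm -> %.2f V drop at %.0f A\n", r_cu, r_cu * current, current);
        printf("steel:  %.4f ohm -> %.2f V drop at %.0f A\n", r_fe, r_fe * current, current);
        return 0;
    }

With these assumed dimensions, copper is marginal over a full 10 m at worst-case load and steel looks like a non-starter unless the feed points are close together.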

Questions:

  • is this a good approach, or is there a better way?
  • if I have two PSUs, each rated for 350W max, does it make sense to connect one to each end of the rail?
  • if I assume that most of the time, I'll have a load of ~20W (because brain surgery is my hobby, not my job), is there a way I could make this more efficient?
  • if I assume that I won't ever need all LEDs on at the same time, can I somehow enforce this (e.g. use three rails: if rail 1 uses more than 175W, power down rail 2 so the total power consumption never exceeds 350W; rough sketch of the idea after this list)?
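Roughly what I mean by that last point is estimating the commanded power per rail from the frame data before sending it; the ~20 mA per WS2812 channel below is an assumption I'd calibrate against a real measurement:

    #include <stdint.h>
    #include <stdio.h>

    #define MA_PER_CHANNEL 20.0   /* assumed full-scale current per R/G/B channel */
    #define SUPPLY_V       5.0

    /* Estimate the power a rail will draw from the frame about to be sent. */
    static double commanded_power_w(const uint8_t *rgb, int n_leds)
    {
        double ma = 0.0;
        for (int i = 0; i < 3 * n_leds; i++)
            ma += MA_PER_CHANNEL * rgb[i] / 255.0;
        return SUPPLY_V * ma / 1000.0;
    }

    int main(void)
    {
        uint8_t frame[3 * 4] = { 255,255,255, 255,255,255, 0,0,0, 0,0,0 };
        /* Before latching a frame: if rail 1's estimate exceeds 175 W,
         * blank or dim rail 2 so both rails stay within one 350 W supply. */
        printf("estimated draw: %.2f W\n", commanded_power_w(frame, 4));
        return 0;
    }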

Best Answer

700W of LEDs is a lot. You'd probably need a welding mask to do anything in that room. Bear that in mind; perhaps consider setting the software so it never gets anywhere near maximum brightness.

It's not a good idea to connect two power supplies in parallel to the same rail. Their output voltages won't be exactly equal, so the supply with the slightly higher voltage will try to carry the whole load and, depending on the exact topology of the supplies, current may even be forced back into the other one; unwanted and potentially damaging things can happen.

I assume that you're proposing using the pipe as the main conductor, much like the cable track lighting shown on that page? You should break the pipe and provide an independent piece of pipe for each power supply you have. The 0V can (and should) be in common.

Monitoring the power use could be done in a number of ways. The simplest (and crudest) would be a fuse or circuit breaker. Current monitoring is typically done with a current shunt monitor: essentially a low-value resistor and some means of measuring the voltage drop across it. As the current increases, the voltage drop increases until it crosses a threshold, at which point you take action. You can get ICs that do most of it for you, or you can build a current-limiting circuit with little more than an op-amp, a handful of resistors and a MOSFET. I assume you're actually controlling the LEDs with a microcontroller or similar? An alternative option would be to feed the output of a current monitor into the microcontroller and take action when the value exceeds a set limit, perhaps by uniformly dimming the LEDs.
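As a minimal sketch of that last suggestion, assuming a shunt plus amplifier feeding one of the microcontroller's ADC channels (the shunt value, gain, ADC parameters and function names below are all placeholders for whatever parts you actually use):

    #include <stdint.h>
    #include <stdio.h>

    /* Stand-in for the real ADC read of the shunt amplifier's output. */
    static uint16_t read_shunt_adc(void) { return 512; }

    /* Convert the ADC reading to amps.  Assuming a 1 milliohm shunt, a x50
     * amplifier and a 3.3 V / 10-bit ADC:  I = Vadc / (gain * Rshunt).    */
    static double shunt_current_a(uint16_t adc)
    {
        double v_adc = adc * 3.3 / 1023.0;
        return v_adc / (50.0 * 0.001);
    }

    /* Global brightness scaler applied before pushing data to the strips:
     * back off whenever the measured current approaches the supply limit. */
    static uint8_t throttle(uint8_t brightness, double limit_a)
    {
        if (shunt_current_a(read_shunt_adc()) > limit_a && brightness > 8)
            return (uint8_t)(brightness * 0.9);   /* dim by 10 % per pass */
        return brightness;
    }

    int main(void)
    {
        uint8_t b = 255;
        for (int i = 0; i < 5; i++)
            b = throttle(b, 30.0);                /* 30 A budget, placeholder */
        printf("brightness after throttling: %u\n", b);
        return 0;
    }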

Without knowing your specific application, it's hard to make recommendations. If you can reduce your requirement to a few hundred watts (which is still a huge amount of light to have indoors), life will be a lot easier. Remember that just because an LED strip can consume a certain amount of power at full brightness doesn't mean you have to command it to.