High power (up to 400W) LED DC/DC power supply design considerations

led-driver, power-supply, switch-mode-power-supply, switching-regulator

I am designing a power supply for a high power COB LED requiring 360W @ ~75V from a variable ~60-80V DC input. I have noticed that it is difficult to find switching controller ICs that advertise being able to operate at such high power outputs, and those that do tend to be relatively complex with 40+ pins and a large external BOM (e.g. LT8210).
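As a quick sanity check on the operating point (my own arithmetic from the figures above, not from any datasheet): the load draws about 4.8 A, and because the 60-80V input range straddles the ~75V output, neither a plain buck nor a plain boost stage can cover it on its own, which is why 4-switch buck-boost controllers like the LT8210 keep coming up.

    # Operating-point check from the figures stated above (my arithmetic,
    # not from any datasheet).
    P_LED = 360.0                     # W, LED power
    V_LED = 75.0                      # V, approximate LED string voltage
    V_IN_MIN, V_IN_MAX = 60.0, 80.0   # V, input range

    i_led = P_LED / V_LED
    print(f"LED current: {i_led:.1f} A")          # ~4.8 A

    # If the input range straddles the output voltage, neither a plain buck
    # nor a plain boost can cover it on its own.
    needs_buck = V_IN_MAX > V_LED
    needs_boost = V_IN_MIN < V_LED
    topology = "buck-boost" if (needs_buck and needs_boost) else ("buck" if needs_buck else "boost")
    print(f"Topology required: {topology}")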

Previously I designed a much simpler buck LED driver using the AL9910, albeit at a lower power output and with a single (buck-only) topology. It was about as bare-bones as a buck converter circuit gets, but it worked in that application without any issues. I am now trying to understand what the inherent differences are between the high and low power applications, and whether I could simply use a "basic" buck/boost switching IC with "upgraded" components, or whether there would be some major problems with this approach.

Some possible explanations for the differences between the ICs that have occurred to me are:

  1. Efficiency – The AL9910's peak efficiency is just above 90%, while the LT8210 appears to be closer to 98%, which is certainly much more important at higher power outputs, though not necessarily a major concern in my application.
  2. Heat – The higher power output generates more heat in the components involved, which at <90% efficiency is very significant (a rough loss estimate follows this list). However, I would have assumed this does not greatly affect the switching controller IC itself so long as the external components can deal with it. Although wasteful, I might prefer a larger heatsink and fan to a more complex circuit overall.
  3. Regulation – Tight voltage regulation and noise reduction.
  4. Inductive Coupling – The higher currents involved have a greater effect on signal traces.
  5. More Complex Applications – It makes sense that such high power systems are usually found in more complex applications, where the additional circuitry for precision, safety, etc. is worth it given the higher overall cost, so simple designs are uncommon.
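As a rough illustration of points 1 and 2, here is a minimal back-of-the-envelope loss estimate (the 90% and 98% figures are the peak-efficiency numbers quoted above; real losses depend on the operating point):

    # Rough loss estimate for a 360 W output at the two quoted peak efficiencies.
    P_OUT = 360.0  # W, LED load

    def dissipation(p_out, efficiency):
        """Power lost in the converter (switch, inductor, diode, traces), in watts."""
        return p_out / efficiency - p_out

    for eff in (0.90, 0.98):
        print(f"{eff:.0%} efficient: ~{dissipation(P_OUT, eff):.0f} W dissipated")

    # ~40 W at 90% vs ~7 W at 98%: the former clearly needs a heatsink and
    # probably a fan, the latter can often get by with PCB copper alone.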

To summarize my question:

What practical limitations or unexpected difficulties arise when pushing the power output of a basic controller IC (one driving an external power switch) beyond its recommended "typical applications", and are these limitations (in)directly addressed by the more complex controller ICs?

A concrete example for this question:

The LT3757 provides a typical application with an 8V-16V input and a 24V 2A output. Vin is the only pin tied to the power portion of the circuit; the switching MOSFET is external. Is there any reason why a circuit with a 100V 5A output could not be made by adjusting the MOSFET, inductor and feedback voltage divider, assuming the input voltage is scaled up by an equal amount?
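As a rough feasibility sketch for that scaled example (my assumptions, not taken from the datasheet's typical application: plain boost topology in CCM, ~95% efficiency, diode drop ignored, and an FBX regulation voltage of about 1.6 V, which should be verified against the LT3757 datasheet):

    # Back-of-the-envelope scaling of the LT3757 boost example to 100 V / 5 A.
    # Assumptions: CCM boost, 95% efficiency, diode drop ignored, FBX ~ 1.6 V
    # (verify against the datasheet), input scaled by the same ~4.2x as the output.
    V_IN_MIN = 33.0    # V, the 8 V minimum input scaled up ~4.2x
    V_OUT = 100.0      # V
    I_OUT = 5.0        # A  -> 500 W
    EFF = 0.95
    V_FBX = 1.6        # V, assumed feedback regulation voltage

    duty = 1.0 - EFF * V_IN_MIN / V_OUT       # loss-adjusted CCM boost duty cycle
    p_in = V_OUT * I_OUT / EFF
    i_in_avg = p_in / V_IN_MIN                # average inductor / switch-on current
    i_peak = i_in_avg * 1.2                   # ~40% ripple -> peak 20% above average

    print(f"Duty cycle at Vin(min): {duty:.2f}")            # ~0.69
    print(f"Average inductor current: {i_in_avg:.1f} A")    # ~16 A
    print(f"Peak switch current: {i_peak:.1f} A")           # ~19 A

    # Feedback divider for 100 V: Vout = V_FBX * (1 + R1/R2)
    R2 = 10e3
    R1 = R2 * (V_OUT / V_FBX - 1.0)
    print(f"R1 ~ {R1 / 1e3:.0f} k, R2 = {R2 / 1e3:.0f} k")  # ~615 k / 10 k

The divider itself scales trivially; the numbers that do not are the ~16 A average / ~19 A peak through the MOSFET and sense resistor, the gate charge of a FET rated well above 100 V relative to the controller's gate-drive capability, and the layout needed to keep those currents out of the signal ground, which is where the simple component-swap approach seems likely to break down.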

As this is for a hobby DIY project I am not overly concerned about efficiency, safety and cost of components, but I would like to avoid designing and ordering a PCB that ends up not working at all.

Best Answer

Unfortunately, with this much power to regulate in boost/buck mode, holding tight error limits through load steps up and down requires complex control of the input and output energy: slope compensation, pre-biased start-up, soft start, OVP, OCP, OTP and so on.

Another example is the TI LM5036.

This also adds cost on top of the source supply, which here is an unregulated supply that still requires a boost stage.

My suggestion is to choose an appropriate power supply design from AC to DC, so that the architecture is optimized rather than compromised. That leaves only a step-down (buck) stage, which is far more stable and less complex. Commercial power sources of this size also need to have active PFC.

So there is no simple add-on solution for your supply, yet as a purchased product the volume BOM cost can be low.

Make vs Buy => BUY