Here is the scenario: I have a project needing 3 supply powers:
- 3.3V to run the microcontroller (max 100mA)
- 5V to run a bunch of addressable LEDs (max 8A, the most current-intensive rail)
- 12V to supply op amps (max 15mA x 6 = 90mA total for all op amp supplies)
The question is: if I want to use only one power adapter brick to power this thing, does it make more sense to use a 12V power supply and step down to 5V and 3.3V? Or is it better to use a 5V power supply, since most of the current is drawn at this voltage, and simply step up to 12V for my op amp needs?
Note: All op amps are signal-level; the 12V rail is there to maximize headroom for preprocessing my analog signals.
Best Answer
Assuming a generous 90 percent efficiency for the step-down, the 5V/8A LED rail (40 watts out) needs roughly 44.5 watts in, so a 12V supply bucked down to 5V wastes about 4.5 watts in conversion alone. Compare that with stepping up: the op amps draw 12V x 90mA, about 1.1 watts, and even at a pessimistic 80 percent boost efficiency that is roughly 1.35 watts in for 1.1 watts out. So the efficiency penalty is about 4.5 watts versus 0.3 watts. Even if you boost to, say, 14V and use an LDO to give your op amps a cleaner rail, you're still looking at more than ten times the power loss with a 12V supply.
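The arithmetic above can be sketched out quickly. The efficiency figures (90 percent for the buck, 80 percent for the boost) are the assumed values from this answer, not datasheet numbers; the 3.3V rail is omitted since it is needed in both topologies:

```python
# Compare converter losses for the two single-brick topologies.
# Assumptions: 90% buck efficiency, 80% boost efficiency (from the answer).
# Loads: 5 V / 8 A LEDs (40 W), 12 V / 90 mA op amps (~1.08 W).

def conversion_loss(p_out_w, efficiency):
    """Power dissipated in a converter delivering p_out_w at the given efficiency."""
    return p_out_w / efficiency - p_out_w

led_power = 5.0 * 8.0          # 40 W on the 5 V rail
opamp_power = 12.0 * 0.090     # ~1.08 W on the 12 V rail

# Option A: 12 V brick, buck down to 5 V for the LEDs
loss_buck = conversion_loss(led_power, 0.90)     # ~4.44 W lost

# Option B: 5 V brick, boost up to 12 V for the op amps
loss_boost = conversion_loss(opamp_power, 0.80)  # ~0.27 W lost

print(f"12 V brick, buck to 5 V:  {loss_buck:.2f} W lost")
print(f"5 V brick, boost to 12 V: {loss_boost:.2f} W lost")
print(f"Loss ratio: {loss_buck / loss_boost:.1f}x")
```

With these assumed numbers the buck path wastes about 4.4 W against roughly 0.27 W for the boost, a ratio of about 16x, which is where the "more than ten times" figure comes from.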
The cost difference between a 12V/5A brick and a 5V/8-10A brick is negligible, but a 45 watt step-down converter will cost more than a 1-2 watt step-up converter.
So in my opinion, it makes more sense to go with the voltage that serves most of your load: a 5V supply, boosted to 12V for the op amps.