5V to 3.3V and 5V to 1.8V, or 5V to 3.3V and 3.3V to 1.8V buck converters

buck

I have a 5V power supply that is converted to 3.3V using a buck converter. Now I also need a 1.8V supply. What is the industry-standard way to feed the 1.8V buck converter: from the 5V supply or from the 3.3V supply?

From an efficiency point of view, I understand that the best choice may depend on the loads. Still, buck converter efficiency generally gets worse the further Vout is from Vin. But if I feed the 1.8V converter from the 3.3V one, its input power also passes through the first converter's losses, though I might gain a little from the increased current draw if the first converter is on the part of its efficiency curve that favors higher current.
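To make that trade-off concrete, here is a rough back-of-the-envelope sketch in Python. All efficiency figures and load powers below are made-up placeholders for illustration, not datasheet values:

```python
# Hypothetical operating point -- real numbers come from each
# converter's datasheet efficiency curves.
P_3V3_LOAD = 1.0   # W drawn by the 3.3 V loads (assumed)
P_1V8_LOAD = 0.5   # W drawn by the 1.8 V loads (assumed)

ETA_5_TO_3V3   = 0.92  # assumed efficiency, 5 V -> 3.3 V buck
ETA_5_TO_1V8   = 0.85  # assumed efficiency, 5 V -> 1.8 V buck (larger step-down)
ETA_3V3_TO_1V8 = 0.90  # assumed efficiency, 3.3 V -> 1.8 V buck (smaller step-down)

# Parallel: both converters fed directly from the 5 V rail.
p_in_parallel = P_3V3_LOAD / ETA_5_TO_3V3 + P_1V8_LOAD / ETA_5_TO_1V8

# Cascaded: the 1.8 V converter's input power is extra load on the
# 3.3 V rail, so it passes through both conversion stages.
p_in_cascaded = (P_3V3_LOAD + P_1V8_LOAD / ETA_3V3_TO_1V8) / ETA_5_TO_3V3

print(f"Parallel : {p_in_parallel:.3f} W from the 5 V rail")
print(f"Cascaded : {p_in_cascaded:.3f} W from the 5 V rail")
```

With these particular numbers the parallel arrangement draws slightly less from the 5V rail, but the result flips easily with different efficiency curves, which is exactly why I suspect it depends on the loads.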

Nevertheless, what is generally done in this case? Two buck converters converting from the same supply, or one converting from the other?

Best Answer

As you said yourself, there might be a fine balance in terms of overall efficiency, and cost: you gain a bit here but lose a bit there. If you have curves of efficiency versus load and input voltage, you can probably calculate a theoretical optimum. However, if the loads vary independently, the optimization space explodes, and usage models would need to be involved. It could be a good semester project for a third-year EE student.
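As a sketch of what such a calculation might look like, here is a small Python script that sweeps a grid of load points and tallies which topology draws less power from the 5V input. The efficiency curves are invented placeholders; in practice you would digitize them from the datasheet plots:

```python
import numpy as np

# Assumed efficiency-vs-load-current curves (placeholder data).
i_pts = np.array([0.01, 0.05, 0.1, 0.3, 0.5, 1.0])  # load current, A

eta_5_to_3v3   = np.array([0.70, 0.84, 0.89, 0.92, 0.92, 0.90])
eta_5_to_1v8   = np.array([0.60, 0.76, 0.82, 0.86, 0.86, 0.84])
eta_3v3_to_1v8 = np.array([0.68, 0.82, 0.87, 0.90, 0.90, 0.88])

def eta(curve, i_load):
    """Linearly interpolate efficiency at a given load current."""
    return np.interp(i_load, i_pts, curve)

def p_in_5v(i_3v3, i_1v8, cascaded):
    """Power drawn from the 5 V rail for a given load point and topology."""
    p_3v3 = 3.3 * i_3v3
    p_1v8 = 1.8 * i_1v8
    if cascaded:
        # The 1.8 V converter's input power is extra load on the 3.3 V rail,
        # which also shifts the first converter's operating point.
        p_mid = p_3v3 + p_1v8 / eta(eta_3v3_to_1v8, i_1v8)
        i_mid = p_mid / 3.3
        return p_mid / eta(eta_5_to_3v3, i_mid)
    return (p_3v3 / eta(eta_5_to_3v3, i_3v3)
            + p_1v8 / eta(eta_5_to_1v8, i_1v8))

# Sweep a grid of load points and count which topology wins where.
wins = {"parallel": 0, "cascaded": 0}
for i_3v3 in np.linspace(0.05, 1.0, 20):
    for i_1v8 in np.linspace(0.05, 0.5, 20):
        par = p_in_5v(i_3v3, i_1v8, cascaded=False)
        cas = p_in_5v(i_3v3, i_1v8, cascaded=True)
        wins["parallel" if par <= cas else "cascaded"] += 1
print(wins)
```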

However, there might be one advantage to the cascaded 5V -> 3.3V -> 1.8V arrangement. A microprocessor system usually requires a certain power-on sequencing, and usually the highest voltage needs to come up first, the middle one next, and the lowest one last. In the cascaded arrangement this sequencing comes naturally, while in the parallel arrangement you would need enable inputs and RC delays. Then again, the sequencing might have special timing requirements, which are easier to meet with properly sized RC networks on the enable pins than with the cascaded arrangement.
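Sizing such an RC delay is a one-line calculation. As a sketch (the enable threshold and component values below are assumptions; use the actual datasheet figures):

```python
from math import log

# Hypothetical values -- the enable threshold comes from the converter's
# datasheet; R and C form the delay network on its EN pin.
V_RAIL = 3.3     # V, rail driving the RC network (assumed)
V_EN   = 1.2     # V, enable-pin rising threshold (assumed)
R      = 100e3   # ohms (assumed)
C      = 100e-9  # farads (assumed)

# RC charging: v(t) = V_RAIL * (1 - exp(-t / (R*C))).
# Solve for the time at which the EN pin crosses its threshold.
t_delay = -R * C * log(1 - V_EN / V_RAIL)
print(f"Enable delay ~ {t_delay * 1e3:.2f} ms")   # ~4.52 ms with these values
```

RC tolerances make this delay approximate; if the timing budget is tight, a dedicated sequencer (or the PMIC route below) is more robust.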

Actually, a better way to implement complex power rails is to use a programmable PMIC with multiple outputs. That is the "industry standard" today.