Electronics – Best way to regulate 9/12V down to 5V and 3.3V

microcontroller, power, voltage-regulator

If my source is 9 or 12V, should I first use a linear regulator to step it down to 5V, and then use a second linear regulator to step the 5V down to 3.3V? The 3.3V will power a microcontroller, while the 5V will drive some 5V sensors/relays.

Current is under 500mA total, probably closer to 300mA. The loads themselves (relays, etc.) are powered from a separate supply.

Best Answer

If those sensors/relays are external to your device, there's a fair chance that the 5V supply can get shorted and shut down. If you want your microcontroller to keep working through such an event, and you care about power efficiency, you'll need separate buck converters for the 5V and 3.3V rails. If you don't mind some heat, separate linear regulators for 5V and 3.3V, each fed straight from the input source, will also do. That arrangement makes it easier to keep relay transients from upsetting the microcontroller, too.

Efficiency-wise, when you use linear regulators it doesn't matter how you stack them: cascaded or independent, all of the current consumed at every regulated voltage has to flow from the input supply. So if you consume 200mA @ 5V and 50mA @ 3.3V, the input supply still delivers 250mA at 9-12V.
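A minimal sketch of that arithmetic, using the example load currents above and an assumed 12V input (the numbers are illustrative, not from the question):

```python
# Heat dissipated by linear regulators: cascaded vs. independent.
# All values here are illustrative assumptions.

V_IN = 12.0      # supply voltage, V (could equally be 9 V)
I_5V = 0.200     # load on the 5 V rail, A
I_3V3 = 0.050    # load on the 3.3 V rail, A

# Independent regulators: each drops the full V_IN down to its output.
p_independent = (V_IN - 5.0) * I_5V + (V_IN - 3.3) * I_3V3

# Cascaded: the 3.3 V regulator hangs off the 5 V rail, so the 5 V
# regulator must carry both load currents.
p_cascaded = (V_IN - 5.0) * (I_5V + I_3V3) + (5.0 - 3.3) * I_3V3

# Input current and total heat come out identical either way;
# only where the heat is dissipated changes.
i_in = I_5V + I_3V3
print(f"Input current: {i_in * 1000:.0f} mA from the {V_IN:.0f} V supply")
print(f"Heat, independent regulators: {p_independent:.2f} W")
print(f"Heat, cascaded regulators:    {p_cascaded:.2f} W")
```

Both arrangements print the same 1.84 W of heat from 250mA of input current; cascading only moves some of the dissipation from the 3.3V regulator onto the 5V one.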

With switching regulators, you have to take each regulator's efficiency at its actual load current (from the datasheet curves) and compare the independent and cascaded variants. And of course, for the microcontroller to stay ON in spite of shorts on the 5V control rail, the regulators need to be independent.
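A rough sketch of that comparison, with placeholder efficiency figures standing in for the datasheet values at your actual load currents:

```python
# Independent vs. cascaded buck converters: input power comparison.
# Efficiencies below are placeholder assumptions; substitute the values
# from your converters' efficiency curves at the actual load currents.

V_5, V_3V3 = 5.0, 3.3
I_5V, I_3V3 = 0.200, 0.050           # load currents, A

P_5 = V_5 * I_5V                      # 1.0 W delivered at 5 V
P_3V3 = V_3V3 * I_3V3                 # 0.165 W delivered at 3.3 V

def p_in(p_out, eff):
    """Input power a converter draws to deliver p_out at efficiency eff."""
    return p_out / eff

# Independent: both converters are fed from the 9-12 V input.
p_independent = p_in(P_5, 0.90) + p_in(P_3V3, 0.85)

# Cascaded: the 3.3 V converter is fed from the 5 V rail, so its input
# power becomes extra load on the 5 V converter.
p_cascaded = p_in(P_5 + p_in(P_3V3, 0.88), 0.90)

print(f"Input power, independent: {p_independent:.2f} W")
print(f"Input power, cascaded:    {p_cascaded:.2f} W")
```

With these made-up numbers the two options land within a few hundredths of a watt of each other; the real answer depends entirely on where each converter sits on its efficiency curve at these light loads.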

I have plenty of 3.3V microcontrollers running from linear regulators fed from 12V - sometimes it's the cheapest option when you have a powerful supply available.
