Which is better as the supply voltage: 5V or 3.3V?

3.3v, 5v, power-supply, voltage

I am about to design a new microcontroller-based project. I checked all the components (there are only a few) and all of them can operate at either 3.3V or 5V. The microcontroller itself can also operate at 3.3V or 5V (a PIC micro running from only a 4MHz oscillator).

Well, the question is: what are the advantages/disadvantages of using 5V as the positive supply over using 3.3V?

All I could think of is that power consumption will be lower when the supply voltage is 3.3V.

Thanks a lot

Best Answer

Well, there is also the noise susceptibility of your circuit, which will be lower if you use 5V (you need higher noise levels to disturb a 5V circuit).

Some components have very different analog performance depending on the supply voltage. If you use an op-amp, the common-mode input range or the headroom needed from the rails might be a point to look out for and help decide which is better for you. So even if all your components work at both 3.3V and 5V, your circuit might not do what you expect if you change the voltage.

@Passerby mentioned that 5V is a de facto standard due to USB. You have to be aware, though, that if you need a stable supply voltage, USB is a bad thing to use directly.

USB actually allows for quite a range of voltages at your device: behind a passive hub the bus voltage can be as low as 4V, and it can go as high as 5.25V. So in that scenario you are better off using 3.3V, generated by a voltage regulator on your board.
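A quick sanity check of that recommendation: the worst-case 4V bus voltage still leaves enough headroom for a low-dropout regulator to produce 3.3V, but could never support a regulated 5V rail. This is only a sketch; the 0.3V dropout figure is an assumed value for a typical LDO, not a number from the answer, so check your regulator's datasheet.

```python
# Headroom check for regulating the USB bus voltage down to 3.3 V.
# VBUS limits are the figures quoted in the answer; DROPOUT is an
# assumed typical LDO dropout voltage (check your part's datasheet).

VBUS_MIN = 4.0    # worst-case bus voltage behind a passive hub
VBUS_MAX = 5.25   # maximum allowed bus voltage
VOUT = 3.3        # target supply rail
DROPOUT = 0.3     # assumed LDO dropout voltage

headroom = VBUS_MIN - (VOUT + DROPOUT)
print(f"3.3 V rail headroom at worst case: {headroom:.2f} V")  # 0.40 V, so regulation holds

# Trying to regulate a 5.0 V rail from the same worst-case input fails:
print(f"5.0 V rail headroom at worst case: {VBUS_MIN - (5.0 + DROPOUT):.2f} V")  # negative, rail would sag
```

The same arithmetic explains why boards that need a clean 5V rail from USB typically resort to a boost or buck-boost converter instead of a linear regulator.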

The maximum operating frequency of your controller might, as pointed out, also depend on the supply voltage. Newer controllers, though, run on much lower voltages internally (around 1.2-1.8V) and no longer tie the maximum frequency to the external supply voltage (the STM32 series, for example).

And of course there is the power consumption: you'll gain quite a lot by going from 5V to 3.3V. You use only 66% of the power in the static case and only (3.3^2/5^2) = 44% in the dynamic CMOS case; your actual benefit will lie somewhere in between. Whether that is actually beneficial (on a global scale it is) or needed depends on your supply situation and the current you need. It could happen that if you go down to 3.3V your supply becomes less efficient, and the power you saved is lost to that lower conversion efficiency (it shouldn't, but it could).
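The two percentages above follow directly from how power scales with voltage: static power scales linearly with V (assuming a roughly constant current draw), while dynamic CMOS power scales with V squared. A minimal check of the arithmetic:

```python
# Power ratio when dropping the supply from 5 V to 3.3 V.
# Static case: P = V * I with I roughly constant, so the ratio is linear in V.
# Dynamic CMOS case: P = C * V^2 * f, so the ratio goes with V squared.

v_lo, v_hi = 3.3, 5.0

static_ratio = v_lo / v_hi           # linear in V
dynamic_ratio = (v_lo / v_hi) ** 2   # quadratic in V

print(f"static:  {static_ratio:.0%}")   # 66%
print(f"dynamic: {dynamic_ratio:.0%}")  # 44%
```

A real circuit mixes both kinds of load, which is why the actual saving lands somewhere between the two figures.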

Another valid point, mentioned by Paul in a comment below: if you interface to some legacy devices, you might need the 5V. Some microcontrollers offer so-called 5V-tolerant pins, but those are only able to accept inputs up to 5V (often only in digital mode). The output is still limited to VDD, so you need a level shifter if 3.3V is not accepted as a high level by the legacy device. You can save that level shifter by using a native 5V controller.
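Whether a 3.3V output is "accepted as a high level" comes down to the legacy device's input thresholds. As a rough check, a 5V CMOS input typically requires VIH of about 0.7 x VDD, while a TTL-style input needs only about 2.0V; these threshold figures are common rules of thumb, not values from the answer, so always confirm against the device's datasheet:

```python
# Does a 3.3 V logic-high satisfy a 5 V input's VIH threshold?
# Thresholds below are typical rule-of-thumb values; consult the datasheet.

vout_high = 3.3            # high-level output of the 3.3 V controller

vih_cmos = 0.7 * 5.0       # typical 5 V CMOS input threshold: 3.5 V
vih_ttl = 2.0              # typical TTL-compatible input threshold

print(vout_high >= vih_cmos)  # False: logic high is NOT guaranteed, level shifter needed
print(vout_high >= vih_ttl)   # True: a TTL-compatible 5 V input reads 3.3 V as high
```

So whether you really need the level shifter depends on the input family of the legacy part, not just on its supply voltage.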