Electronics – IC voltages and low-power design: what to do?


I’m working on a low-energy device consisting of a couple of ICs.
What is confusing me is the following: all the ICs have a VCC supply range of 1.7 V to 3.6 V, and most of the time the current figure is given at a 3.3 V supply. Chasing the most efficient design, which is actually better: running at the bare minimum supply (assuming the internal LDOs then have less power to dissipate), or running at 3.3 V?

I have tried to find out whether there is reduced performance at lower voltage, but it is not apparent. The other question that comes off the back of the previous one: if all ICs have a supply range of 1.7 to 3.6 V and all I/O pins are rated to 3.3 V, can I run some ICs at 1.8 V and others at 3.3 V without affecting, say, SPI or I2C communication?

These are the ICs: MCU, infocenter.nordicsemi.com/pdf/nRF52832_PS_v1.4.pdf; radio, semtech.com/uploads/documents/sx1272.pdf; GPS, u-blox.com/en/product/sam-m8q-module. I also found a new MPPT DC regulator from ST: st.com/content/st_com/en/products/power-management/… The idea is to run the MCU from the 1.7 V LDO and only enable the 3.3 V LDO that feeds the rest of the system once the battery charge is sufficient. My concern would be the UART and SPI comms between the GPS, the radio IC and the MCU.

Any advice is appreciated

Best Answer

From looking at IC datasheets from Atmel and TI, it generally looks like you get lower current consumption at lower voltages, at least for digital/switching circuits. Knowing which specific chips you are talking about would help.

If the ICs are rated for 1.7–3.6 V, you can certainly run them at 1.7 V with no issue, although be aware of possibly reduced operating ranges (such as a lower maximum clock frequency), as glen_geek points out. An example of this is on page 346 of the ATmega88PB datasheet.

[Figure from the ATmega88PB datasheet: maximum clock frequency vs. VCC (safe operating range)]

You can also see from Table 34-4 that the scaling of current with speed and voltage is not linear: 1 MHz at 2 V is max 0.5 mA (1 mW/MHz), whereas 8 MHz at 5 V is max 9 mA (5.6 mW/MHz). This implies that, as long as you don't need high speed for some other reason, it's better to use a low speed and a low voltage.

As MCG notes, level conversion is often necessary for bidirectional communication between different supply voltages, although this need not be power-hungry: translators are available with quiescent currents in the microamp range.

edited to answer comment: The more complicated MCUs like the nRF52832 are a bit of a special case, because they have internal regulators and internally run at only one voltage (in this case 1.3 V). That chip in particular has a DC/DC switching converter, which converts at high efficiency, so the actual input voltage matters less if this is selected (although lower is still better, because the efficiency is not 100%). If using the internal LDO, then it doesn't really matter whether you convert externally or internally, EXCEPT that I'm not 100% sure what happens with the GPIOs in terms of power.

The GPS also includes an LDO, so you don't gain anything by adding another LDO in front of it. This changes, however, if you use a switching converter instead.

This illustrates a general principle: advanced RF systems (i.e. anything beyond a simple OOK SAW modulator) will only run at one specific internal voltage. They therefore tend to have an integrated regulator, either an LDO or a DC/DC converter. If it's an internal LDO, you can get significant savings by supplying it from an external DC/DC regulator; if it's an internal DC/DC, you can get some limited savings from a higher-efficiency external converter, but probably not much.