How does the current draw of an IC change with supply voltage?

battery-operated, integrated-circuit, ldo, microcontroller, power-supply

Suppose I have a device with a number of ICs (microcontroller, logic gates, MEMS sensors, etc.) that can operate from a supply voltage in the range of, say, 1.7 V to 3.6 V. Will the current draw be different if I power the devices at the low end (1.7 V) versus the high end (3.6 V) of the operating range? Does the answer depend on the specific IC, or is there a general rule of thumb?

The datasheets I've looked at don't go into this level of detail; they only specify current draw at a single supply voltage. I'm interested because my device is battery powered and I'm deciding whether to run directly from the battery or to use an LDO regulator.

Best Answer

It depends on the IC.

For ordinary digital CMOS chips, the dominant current draw comes from charging and discharging node capacitances to the full rail voltage on every switching event. Average current is therefore roughly proportional to the supply voltage, and dynamic power roughly proportional to its square, as the worked numbers below show.
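As a rough illustration (the capacitance and frequency figures here are invented for the example, not taken from any datasheet), with effective switched capacitance $C_{sw}$ and average toggle rate $f$:

$$ I_{avg} \approx C_{sw}\, V_{DD}\, f, \qquad P_{dyn} \approx C_{sw}\, V_{DD}^{2}\, f $$

For, say, $C_{sw} = 100\,\text{pF}$ and $f = 10\,\text{MHz}$, that gives about 1.7 mA at 1.7 V but 3.6 mA at 3.6 V, and the power ratio is $(3.6/1.7)^2 \approx 4.5$: the high end of the range costs roughly twice the current and four and a half times the power.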

For analog chips the current is also likely to be positively correlated with the supply voltage, but the correlation may be weaker than for digital parts, since analog designs are often built around (roughly) constant bias currents derived from an internal reference; in the limit, the current is flat with voltage and the power simply scales with it.

And very occasionally you will come across a device with a built-in switched-mode power supply; such a part draws roughly constant power, so its input current actually drops as the supply voltage increases.
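To put the three cases side by side, here is a minimal sketch; every component value in it (C_SW, F_CLK, I_REF, P_LOAD, the 90% converter efficiency) is an invented placeholder, not from any real part:

```python
# Rough scaling models for IC input current vs. supply voltage.
# All component values below are invented for illustration.

C_SW = 100e-12   # effective switched capacitance (F), assumed
F_CLK = 10e6     # average toggle rate (Hz), assumed
I_REF = 1e-3     # bias current of a constant-current analog part (A), assumed
P_LOAD = 5e-3    # load power of a part with an internal switcher (W), assumed

def i_digital(vdd):
    """Dynamic CMOS: I = C*V*f, so current scales linearly with V."""
    return C_SW * vdd * F_CLK

def i_analog(vdd):
    """Constant-current analog bias: current roughly independent of V."""
    return I_REF

def i_smps(vdd, efficiency=0.9):
    """Internal switcher: roughly constant power, so current falls as V rises."""
    return P_LOAD / (efficiency * vdd)

for v in (1.7, 3.6):
    print(f"Vdd = {v:.1f} V: "
          f"digital {i_digital(v)*1e3:.2f} mA, "
          f"analog {i_analog(v)*1e3:.2f} mA, "
          f"SMPS {i_smps(v)*1e3:.2f} mA")
```

One practical consequence for the battery-vs-LDO question: an LDO is a linear pass element, so the battery-side current is essentially the load current plus the regulator's quiescent current, with the voltage headroom dissipated as heat. Regulating a digital load down to 1.7 V therefore does reduce the current drawn from the battery, but it does not buy the extra efficiency a switching regulator would.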