Electronic – Why a lower voltage is better for modern fast CPUs and other similar chips

cpupower

I started to use computers in the 1980s. As far as I remember, 8-bit CPUs of that era, such as the Z80, were often powered by 5 V and used the same voltage for I/O signals. Later CPUs ran at higher speeds and started to consume more power. I would expect that to deliver more power to a chip we must use either a higher voltage or a higher current. And since high currents normally need thick cables, I would expect CPUs to go for higher voltages to keep currents low. But the opposite is true. For example, the standard Intel Pentium 4 and Core 2 Quad CPUs I use at home have a 95 W TDP, which means they can consume more than 100 W during power spikes. Since they run on a very low voltage of around 1 V, the power must actually be delivered at approximately 100 A. So here is my question: why is this preferred, and why is it efficient?
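
(To spell out the arithmetic behind that 100 A figure, using the rough numbers above: \$ I = \frac{P}{V} \approx \frac{100\ \text{W}}{1\ \text{V}} = 100\ \text{A} \$.)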

Best Answer

In traditional CMOS circuits, the power consumption, to first order, followed this expression:

\$ P \propto f_{CLOCK} \times C_{LOAD} \times V^2_{SUPPLY} \$

where the load capacitance was the effective capacitance of the internal wiring and transistor gate oxide. Notice that power consumption is proportional to the square of the supply voltage, so lowering the supply voltage is a powerful way to decrease power consumption. Unfortunately, just lowering the supply voltage tends to make the circuits run slower, so other changes, such as scaling the transistors down, are necessary to keep achieving higher clock frequencies.
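
As a rough numeric illustration of that first-order relationship, here is a short Python sketch. All values (clock rate, effective switched capacitance, supply voltages) are assumptions chosen only to give plausible magnitudes, not figures for any particular CPU:

```python
# Illustrative sketch of first-order ("dynamic") CMOS power scaling.
# The absolute numbers are assumed for illustration only; the point
# is the quadratic dependence on supply voltage.

def dynamic_power(f_clock_hz, c_load_farads, v_supply_volts):
    """P ~ f * C * V^2 (activity factor and constants folded in)."""
    return f_clock_hz * c_load_farads * v_supply_volts ** 2

# Assumed example: 3 GHz clock, 30 nF effective switched capacitance.
f = 3e9      # Hz (assumed)
c = 30e-9    # F  (assumed effective switched capacitance)
for v in (1.4, 1.2, 1.0):
    print(f"V = {v:.1f} V -> P ~ {dynamic_power(f, c, v):.0f} W")

# Dropping the supply from 1.4 V to 1.0 V roughly halves dynamic power
# (1.0^2 / 1.4^2 ~ 0.51), even with frequency and capacitance unchanged.
```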

As transistor scaling approached deep-submicron feature sizes, say below about 250 nm, transistors stopped behaving like "traditional" CMOS and became noticeably leaky. That added a leakage term to the power equation that is proportional to the supply voltage (not the voltage squared), which limited the benefit of lowering the supply voltage to decrease power consumption.
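
A minimal extension of the same sketch shows why the payoff from voltage scaling shrinks once leakage matters. The leakage current below is an assumed illustrative figure, not a datasheet value:

```python
# Extending the sketch with a leakage term that scales with V (not V^2).

def total_power(f_clock_hz, c_load_farads, v_supply_volts, i_leak_amps):
    dynamic = f_clock_hz * c_load_farads * v_supply_volts ** 2
    static = v_supply_volts * i_leak_amps   # leakage: proportional to V only
    return dynamic, static

f, c, i_leak = 3e9, 30e-9, 10.0   # assumed values; i_leak in amps
for v in (1.4, 1.0):
    dyn, stat = total_power(f, c, v, i_leak)
    print(f"V = {v:.1f} V -> dynamic ~ {dyn:.0f} W, leakage ~ {stat:.0f} W")

# The dynamic part still drops quadratically with V, but the leakage part
# only drops linearly, so lowering the supply buys less than it used to.
```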