Why does working a processor harder use more electrical power?


Back in the mists of time when I started coding, processors (at least as far as I'm aware) all used a fixed amount of power. There was no such thing as a processor being "idle".

These days there are all sorts of technologies for reducing power usage when the processor is not very busy, mostly by dynamically reducing the clock rate.

My question is: why does running at a lower clock rate use less power?

My mental picture of a processor is of a reference voltage (say 5V) representing a binary 1, and 0V representing 0. Therefore I tend to think of a constant 5V being applied across the entire chip, with the various logic gates disconnecting this voltage when "off", meaning a constant amount of power is being used. The rate at which these gates are turned on and off seems to have no relation to the power used.

I have no doubt this is a hopelessly naive picture, but I am no electrical engineer. Can someone explain what's really going on with frequency scaling, and how it saves power? Are there any other ways that a processor uses more or less power depending on its state? e.g. does it use more power if more gates are open?

How are mobile / low power processors different from their desktop cousins? Are they just simpler (fewer transistors?), or is there some other fundamental design difference?

Best Answer

The rate at which these gates are turned on and off seems to have no relation to the power used.

This is where you are wrong. Basically, each gate is a capacitor with an incredibly tiny capacitance. Switching it on and off by "connecting" and "disconnecting" the voltage moves an incredibly tiny electrical charge into or out of the gate - that's what makes it act differently.
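
To put rough numbers on that, here is a back-of-the-envelope sketch in Python. The gate capacitance and supply voltage below are illustrative guesses, not figures for any real process:

    # Energy drawn from the supply when one gate's tiny capacitance is
    # charged and later discharged. Both values are assumptions chosen
    # purely for illustration.
    C_GATE = 1e-15   # ~1 femtofarad per gate (assumed)
    V_DD = 1.0       # supply voltage in volts (assumed)

    # One full charge/discharge cycle dissipates on the order of C * V^2.
    energy_per_switch = C_GATE * V_DD ** 2
    print(f"Energy per switching event: {energy_per_switch:.1e} J")  # ~1e-15 J

That is a vanishingly small amount of energy per event, which is why a single gate toggling is invisible; but it is not zero.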

And a moving electrical charge is a current, which uses power. All those tiny currents from billions of gates being switched billions of times per second add up quite a bit.
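
Putting it together, the usual approximation for this switching (dynamic) power is P ≈ a·C·V²·f, where a is the fraction of gates that switch each cycle, C is the total switched capacitance, V the supply voltage and f the clock frequency. A quick sketch with made-up numbers (the dynamic_power helper and every constant here are invented for illustration) shows why halving the clock roughly halves the power:

    # Rough model of dynamic (switching) power: P ~ a * C * V^2 * f.
    # All numbers are invented for illustration and describe no real chip.
    def dynamic_power(switched_capacitance, voltage, frequency, activity=0.2):
        """Approximate dynamic power in watts."""
        return activity * switched_capacitance * voltage ** 2 * frequency

    C_TOTAL = 1e-9   # total switched capacitance in farads (assumed)
    V_DD = 1.2       # supply voltage in volts (assumed)

    print(dynamic_power(C_TOTAL, V_DD, 3.0e9))   # full clock -> ~0.86 W
    print(dynamic_power(C_TOTAL, V_DD, 1.5e9))   # half clock -> ~0.43 W

Because the voltage term is squared, power-saving modes that lower the supply voltage along with the clock (dynamic voltage and frequency scaling) save even more than frequency scaling alone.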