Electrical – Why do higher-frequency processors use more power?

microprocessor, power

Why do higher-frequency processors use more power?

A higher-frequency processor would finish the task in less time, increasing idle time and therefore reducing power consumption.

This would compensate for the fact that it uses more power.

What's wrong in my reasoning?

Best Answer

If you have a processor that can operate at two frequencies when it isn't idling, say f1 and f2, then the power consumption will differ between the two frequencies, as explained in other answers here.

The power consumption depends on the frequency in a non-linear fashion, so you might have:

f1: 100 MHz → 1 W
f2: 200 MHz → 2.5 W
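
The usual reason for that super-linear growth (not spelled out above, so treat the numbers as illustrative assumptions) is that CMOS dynamic power scales roughly as C·V²·f, and running at a higher frequency generally requires a higher supply voltage. A minimal Python sketch, with a capacitance and voltages picked only to roughly reproduce the figures above:

    # Rough CMOS dynamic-power model: P ~= alpha * C * V^2 * f
    # C_EFF and the voltages are assumed values chosen to roughly match
    # the 1 W / 2.5 W figures above; they are not real chip data.
    def dynamic_power(c_eff_farads, v_volts, f_hz, alpha=1.0):
        return alpha * c_eff_farads * v_volts ** 2 * f_hz

    C_EFF = 1e-8  # effective switched capacitance, farads (assumed)

    p1 = dynamic_power(C_EFF, v_volts=1.00, f_hz=100e6)  # ~1.0 W
    p2 = dynamic_power(C_EFF, v_volts=1.12, f_hz=200e6)  # ~2.5 W

    print(p1, p2)  # doubling f (and raising V) more than doubles power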

If you have to execute 100 million instructions and the processor can do one instruction per clock cycle, you can do it at f1 or f2:

energy used at f1 = 100M instructions / 100 MHz / 1 (instruction/cycle) × 1 W = 1 J
energy used at f2 = 100M instructions / 200 MHz / 1 (instruction/cycle) × 2.5 W = 1.25 J

So at f2 the execution completes in 0.5 s instead of the 1 s at f1, but it takes more energy.
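
For anyone who wants to check the arithmetic, here is a small Python sketch of the same calculation (the helper function is just illustrative; the instruction count, frequencies and powers come straight from the figures above):

    # Energy = (instructions / (frequency * IPC)) * power
    def run(instructions, f_hz, power_w, ipc=1.0):
        time_s = instructions / (f_hz * ipc)
        return time_s, time_s * power_w

    t1, e1 = run(100e6, 100e6, 1.0)  # 1.0 s, 1.00 J at f1
    t2, e2 = run(100e6, 200e6, 2.5)  # 0.5 s, 1.25 J at f2

    print(t1, e1, t2, e2)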

However, there are other considerations in a computer system. For example, if a disk drive can go into an idle state sooner because the processing has finished, the savings in the drive's power consumption may outweigh the extra energy used in the processing. Another example: if the user can finish their work in half the time, they can shut down the computer and save the energy used to run the monitor.
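
A rough sketch of that system-level trade-off (the 5 W baseline for the disk, monitor, etc. is an assumed figure, not something from the answer): if the rest of the system draws power only while the job is running, the faster run can win on total energy, which is the usual "race to idle" argument.

    # Total energy while the job runs = run time * (CPU power + baseline power).
    # Afterwards the whole system is assumed to idle at negligible power.
    BASELINE_W = 5.0  # assumed disk + monitor + misc. draw

    def total_energy(run_time_s, cpu_power_w):
        return run_time_s * (cpu_power_w + BASELINE_W)

    slow = total_energy(1.0, 1.0)  # 6.00 J at f1
    fast = total_energy(0.5, 2.5)  # 3.75 J at f2

    print(slow, fast)  # here the higher frequency saves total energy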