I know that a typical desktop CPU (from Intel or AMD, say) can consume 45-140 W, and that many CPUs run on a core voltage of around 1.2 V or 1.25 V.
So, assuming a CPU operating at 1.25 V with a TDP of 80 W, it draws 80 W / 1.25 V = 64 A (a lot of amps).
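Just to show my arithmetic (these are the 80 W and 1.25 V figures assumed above, not measured values from any specific chip):

```python
# Back-of-envelope check: average supply current from I = P / V.
# Assumed figures from the question, not a datasheet.
tdp_watts = 80.0      # assumed TDP
core_voltage = 1.25   # assumed core voltage in volts

current_amps = tdp_watts / core_voltage
print(current_amps)  # 64.0
```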
Why does a CPU need more than 1 A in its circuits (assuming FinFET transistors)? I know that the CPU is idle much of the time, and that the 64 A comes in pulses because the CPU is clocked, but why can't a CPU operate at 1 V and 1 A?
Approximately how much current does a small, fast FinFET transistor need (say, a 14 nm device switching at 3.0 GHz)?
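Here is my own rough attempt using the dynamic-power formula P = a·C·V²·f, where the per-gate capacitance and activity factor are pure guesses on my part (I have no datasheet figure for a 14 nm FinFET), so please correct me if they are far off:

```python
# Rough per-transistor estimate: dynamic power P = a * C * V^2 * f,
# then average current I = P / V = a * C * V * f.
# ALL numbers below are assumed ballpark values, not measured ones.
gate_capacitance = 0.1e-15  # ~0.1 fF per gate (assumed)
voltage = 1.25              # volts (assumed, as in the question)
frequency = 3.0e9           # 3.0 GHz clock
activity = 1.0              # worst case: the gate toggles every cycle

dynamic_power = activity * gate_capacitance * voltage**2 * frequency
average_current = dynamic_power / voltage

print(f"power   ~ {dynamic_power:.2e} W")    # ~4.69e-07 W
print(f"current ~ {average_current:.2e} A")  # ~3.75e-07 A, i.e. sub-microamp
```

If a single transistor only needs a fraction of a microamp even at full toggle rate, then the 64 A figure would come from billions of them switching at once, which is roughly what I am trying to confirm.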
Does higher current make transistors switch on and/or off more quickly?