Why is propagation delay a function of supply voltage?

Tags: delay, fpga, mosfet, propagation

In the context of FPGAs, it was brought to my attention today that propagation delay varies with supply voltage. When I asked why this was so, one of the FPGA designers stated that a logic gate is essentially an amplifier (which I do agree with), that more supply voltage means more gain (which I'm not so sure about), and that more gain means faster propagation.

I didn't really buy this argument. All else being equal, simply increasing the supply voltage should not increase the small-signal gain, which is beside the point anyway since the 'amplifier' is being over-driven with digital signals.

The only thing I can think of is that a larger supply voltage implies a higher absolute slew rate at the output, and since the FET threshold voltages don't change (I think), the thresholds are crossed sooner, leading to less delay.
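To put rough numbers on that intuition, here is a toy RC sketch, assuming a fixed driver resistance and a fixed switching threshold (all component values are illustrative, not from any real process):

```python
import math

# Toy model of the slew-rate argument above: an output node with a fixed
# driver resistance R and load capacitance C charges toward Vdd, and the
# next gate switches when the node crosses a fixed threshold Vt.
# Time to cross the threshold: t = R*C * ln(Vdd / (Vdd - Vt))
R = 1e3      # driver resistance in ohms (illustrative, assumed constant)
C = 10e-15   # load capacitance in farads (illustrative)
Vt = 0.4     # threshold voltage in volts, assumed independent of Vdd

for vdd in (0.8, 1.0, 1.2, 1.5):
    t = R * C * math.log(vdd / (vdd - Vt))
    print(f"Vdd = {vdd:.1f} V -> threshold crossed after {t * 1e12:.1f} ps")
```

Even with the driver strength held constant, the fixed threshold is crossed sooner as Vdd rises.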

Is my reasoning correct, or is there something else going on here?

Best Answer

I didn't really buy this argument. All else being equal, simply increasing the supply voltage should not increase the small-signal gain, which is beside the point anyway since the 'amplifier' is being over-driven with digital signals.

It really does. Don't think in terms of voltage gain; think in terms of output current:

When you increase the supply voltage, the gate capacitance of the next FET is charged more quickly, because the drive current available from the transistors grows strongly with the gate overdrive (roughly as (Vdd − Vt)^α, with α between 1 and 2 in modern short-channel devices).
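A sketch of this using the standard alpha-power-law delay model (Sakurai–Newton), not anything FPGA-specific; the constants are illustrative fitting values:

```python
# Gate delay in the alpha-power-law model scales roughly as
#     t_d ≈ k * C_load * Vdd / (Vdd - Vt)**alpha
# The available drive current, ~(Vdd - Vt)**alpha, grows faster than the
# voltage swing Vdd it must supply, so delay falls as the supply rises.
C_load = 10e-15  # load capacitance in farads (illustrative)
Vt = 0.4         # threshold voltage in volts (illustrative)
alpha = 1.3      # 1 < alpha < 2 for velocity-saturated short channels
k = 500.0        # fitting constant, arbitrary here

for vdd in (0.8, 1.0, 1.2, 1.5):
    t_d = k * C_load * vdd / (vdd - Vt) ** alpha
    print(f"Vdd = {vdd:.1f} V -> t_d ≈ {t_d * 1e12:.1f} ps")
```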

That leads to a quasi-linear increase of the maximum switching speed in modern CMOS ICs. Sadly, you buy that speed with a rising energy dissipation per switching event (= heat!), and thus with a superquadratically growing effort to keep the IC at an acceptable temperature. That's one of the main reasons (if not the main reason) there's little complex logic clocked at 10 GHz.
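A minimal sketch of why the heat bill grows superquadratically, using the standard dynamic-power relation with illustrative numbers:

```python
# Dynamic energy per output transition is roughly E = C * Vdd**2, so
# switching power is
#     P ≈ a * C * Vdd**2 * f
# If a higher Vdd buys you a roughly proportional clock frequency f,
# power ends up growing roughly with Vdd**3. All numbers illustrative.
C = 1e-9   # total switched capacitance in farads (illustrative)
a = 0.2    # activity factor: fraction of nodes toggling per cycle

for vdd, f in ((0.8, 1.0e9), (1.0, 1.25e9), (1.2, 1.5e9)):
    P = a * C * vdd**2 * f
    print(f"Vdd = {vdd:.1f} V, f = {f / 1e9:.2f} GHz -> P ≈ {P:.2f} W")
```

Here a 50% supply increase (0.8 V to 1.2 V) roughly triples the dissipated power for a 50% clock gain, which is why supply scaling runs out of steam long before 10 GHz.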