MOSFET Leakage – Why Subthreshold Leakage Increases with Temperature

leakage-current, mosfet

As far as I know, the source-drain subthreshold leakage current in a MOSFET is given by

$$I_{ds} = e^{1.8} \frac{\mu C_{ox} W}{L} \left( \frac{kT}{q} \right)^2 e^{\frac{V_{gs} - V_t}{n kT/q}} \left( 1 - e^{-\frac{V_{ds}}{kT/q}} \right)$$

where everything should be clear except perhaps that \$n\$ is an ideality coefficient. Now it's known that the threshold voltage decreases with increasing temperature, but what I can't see is how this (or any other) effect dominates the two temperature-dependent exponentials, which go as \$e^{1/T}\$ and \$(1-e^{-1/T})\$, respectively. Both of these are monotonically decreasing with temperature as far as I can tell. Perhaps someone is able to parse the \$T\$ dependencies better than I am, however.
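To spell out the \$T\$ dependence I have in mind (writing both exponents as \$a/T\$ with a constant \$a > 0\$, i.e. implicitly taking the overdrive as positive), each factor has a negative derivative in \$T\$:

$$\frac{d}{dT} e^{a/T} = -\frac{a}{T^2}\, e^{a/T} < 0, \qquad \frac{d}{dT}\left(1 - e^{-a/T}\right) = -\frac{a}{T^2}\, e^{-a/T} < 0.$$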

I am looking for a mathematical argument based on the formula above.

Best Answer

As discussed in the comments, what I was missing is that \$V_{gs} - V_t < 0\$ by definition in the subthreshold regime, so the gate exponential is not a factor of \$e^{1/T}\$ but of \$e^{-1/T}\$, which increases toward 1 as \$T\$ grows. Together with the \$\left(\frac{kT}{q}\right)^2\$ factor out front (and the drop in \$V_t\$ itself), this makes the current increase with temperature in all but the most pathological cases (where, for example, \$\mu\$ decays very strongly with \$T\$).
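For what it's worth, here is a minimal numerical sketch of the expression above. All device parameters (\$\mu_0\$, \$C_{ox}\$, \$W/L\$, \$V_{t0}\$, \$n\$, the linear \$V_t(T)\$ slope and the \$\mu \propto T^{-1.5}\$ mobility law) are made-up illustrative values, not a real process:

```python
import numpy as np

k = 1.380649e-23      # Boltzmann constant [J/K]
q = 1.602176634e-19   # elementary charge [C]

def subthreshold_current(T, Vgs=0.0, Vds=0.05, Vt0=0.4, n=1.4,
                         mu0=0.04, Cox=0.01, W_over_L=10.0,
                         dVt_dT=-1e-3, T0=300.0):
    """Subthreshold expression from the question; all parameter
    values are illustrative placeholders, not a real process."""
    vth = k * T / q                  # thermal voltage kT/q [V]
    Vt = Vt0 + dVt_dT * (T - T0)     # assumed linear Vt(T) roll-off
    mu = mu0 * (T / T0) ** -1.5      # assumed phonon-limited mobility law
    prefactor = np.exp(1.8) * mu * Cox * W_over_L * vth**2
    return (prefactor
            * np.exp((Vgs - Vt) / (n * vth))   # Vgs - Vt < 0: grows with T
            * (1.0 - np.exp(-Vds / vth)))      # drain factor: shrinks slowly with T

for T in (250.0, 300.0, 350.0, 400.0):
    print(f"T = {T:5.1f} K  ->  I_ds ~ {subthreshold_current(T):.3e} A")
```

With these numbers the printed current climbs steeply with each 50 K step, driven almost entirely by the gate exponential; the \$T^{-1.5}\$ mobility roll-off and the shrinking drain-bias factor only nibble at the edges.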