Identifying Leakage Current in a MOSFET

analog · cadence · integrated-circuit · leakage-current · transistors

I'm trying to understand the leakage current characteristics of a 130 nm PDK in Cadence. I set up a testbench by grounding the gate and source terminals of an NMOS FET and setting the drain voltage to a constant 200 mV. For a very small device (W/L = 300n/150n), about 0.5 pA flows into the drain, and as I increase the drain voltage, the drain current increases linearly. I checked the Ron value through a DC simulation; Ron is around 600 GΩ (200 mV / 600 GΩ = 0.33 pA, not far from the measured value).
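
For reference, a minimal SPICE-style sketch of the testbench described above might look like the following. The device name `nmos_lv` and the model-card path are placeholders, not actual names from the 130 nm PDK:

```
* Leakage testbench sketch: gate, source, and bulk grounded; drain biased to 200 mV.
* "nmos_lv" and the model-card path below are placeholders, not real PDK names.
.include "models_130nm.lib"

M1 d g 0 0 nmos_lv W=300n L=150n
Vd d 0 DC 200m       ; drain bias
Vg g 0 DC 0          ; gate grounded

* Sweep the drain to reproduce the linear I_D(V_D) behavior and extract Ron.
.dc Vd 0 0.2 1m
.end
```

The current through Vd with VGS = 0 is the off-state leakage; dividing the drain bias by that current gives the ~600 GΩ figure quoted above.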

I found many posts saying that increasing the channel length reduces leakage current, so I increased the device size to W/L = 3000n/1500n. However, the drain current is now 10 times higher.

Is this the right way to measure the leakage current? If not, what is this current I'm looking at?

Best Answer

Well, that's not surprising: you also increased the width by a factor of 10, and that width increase will, most likely, dominate the leakage compared to the reduction that the longer channel provides on its own.
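
As a first-order sanity check (a textbook off-state model, not the PDK's BSIM-level equations), the leakage splits roughly into a subthreshold term and a junction term:

$$I_{D,\text{off}} \;\approx\; \underbrace{I_0\,\frac{W}{L}\,e^{\frac{V_{GS}-V_{th}}{n V_T}}\left(1-e^{-V_{DS}/V_T}\right)}_{\text{subthreshold conduction}} \;+\; \underbrace{J_j\,A_D(W)}_{\text{drain-junction leakage}}$$

where \(I_0\) is a process-dependent prefactor, \(n\) the subthreshold slope factor, \(V_T = kT/q\) the thermal voltage, \(J_j\) the junction leakage density, and \(A_D(W)\) the drain junction area. Every term grows with W (the junction area scales directly with it), while a longer L only helps the subthreshold term, mainly by raising \(V_{th}\) through reduced short-channel effects. So scaling W by 10x can easily swamp whatever the longer channel saves; to isolate the length dependence, keep W fixed at 300 nm and sweep only L.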