Why does a MOSFET's drain current stop increasing beyond a certain gate voltage?

mosfet

I recently ran an experiment where I grounded the source of a MOSFET, drove the gate from a voltage source (Vgs), and connected the drain through a 1 kΩ resistor to a supply Vdd = 10 V.

I kept Vdd constant at 10 V and swept Vgs from 0 V to 10 V while also monitoring Vds. I calculated the drain current Id as my output.

As I increased Vgs from 0 to 10 V, I noticed that Id stayed at 0 until Vgs > 1 V, which makes sense because 1 V was probably the threshold voltage of the MOSFET.

However, as Vgs approached 10 V, the rate of increase of the drain current began to decline, and the curve looked like it was starting to flatten out. I was wondering why this happened.

I thought it might be because the MOSFET was reaching saturation, but Vds decreased as Vgs increased, so Vgs - Vth > Vds, which means the device was actually in the triode (ohmic) region rather than saturation.

I suspect it may have something to do with the gate voltage Vgs approaching the constant drain supply voltage Vdd, but I'm not sure why.

Can someone explain?
Thanks.

Best Answer

A supply of 10 V across 1 kΩ means 10 mA is the maximum current you can get through the MOSFET. That is the limit in your experiment: no matter how much more you turned up the gate voltage, the aiming point for the drain current was 10 mA, and it cannot be exceeded no matter how hard you try.
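Here is a minimal sketch in Python of what the resistor's load line enforces, using an assumed square-law MOSFET model (Vth = 1 V and k = 2 mA/V² are illustrative parameters, not values from your device). For each Vgs it bisects for the Vds where the device curve crosses the 1 kΩ load line and then prints the resulting drain current:

```python
# Minimal sketch: square-law NMOS model against a resistive load line.
# VTH and K are assumed illustrative parameters, not measured values.

VDD = 10.0   # drain supply (V)
R = 1e3      # drain resistor (ohms)
VTH = 1.0    # threshold voltage (V), assumed
K = 2e-3     # transconductance parameter (A/V^2), assumed

def mosfet_id(vgs, vds):
    """Square-law drain current for an NMOS device."""
    vov = vgs - VTH                              # overdrive voltage
    if vov <= 0:
        return 0.0                               # cutoff
    if vds < vov:
        return K * (vov * vds - vds * vds / 2)   # triode region
    return K * vov * vov / 2                     # saturation region

def operating_point(vgs, tol=1e-9):
    """Bisect for the Vds where the device current meets the load line."""
    lo, hi = 0.0, VDD
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # Device wants more current than the resistor can supply -> Vds is lower.
        if mosfet_id(vgs, mid) > (VDD - mid) / R:
            hi = mid
        else:
            lo = mid
    vds = (lo + hi) / 2
    return vds, (VDD - vds) / R

for vgs in (0.0, 1.0, 2.0, 3.0, 5.0, 7.0, 10.0):
    vds, i_d = operating_point(vgs)
    print(f"Vgs = {vgs:4.1f} V -> Vds = {vds:6.3f} V, Id = {i_d * 1e3:6.3f} mA")
```

With these made-up parameters, Id climbs quickly at first and then creeps toward Vdd/R = 10 mA without ever reaching it, which is exactly the flattening you observed.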

Of course Vds decreased: the MOSFET was being turned on really hard and acting as a very small resistor. That small resistance forms a potential divider with the 1 kΩ, so the drain sits at a very small voltage that is nowhere near 10 V.
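For example, if the fully enhanced MOSFET presents, say, 50 Ω (an assumed on-resistance, not a measured one), the drain sits at 10 V × 50/1050 ≈ 0.48 V and Id ≈ 9.5 mA: close to, but still below, the 10 mA ceiling.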